
Online Consumer

Protection:

Theories of Human Relativism


Kuanchin Chen
Western Michigan University, USA
Adam Fadlalla
Cleveland State University, USA

Information Science Reference


Hershey New York

Director of Editorial Content: Kristin Klinger
Senior Managing Editor: Jennifer Neidig
Managing Editor: Jamie Snavely
Managing Development Editor: Kristin M. Roth
Assistant Managing Editor: Carole Coulson
Typesetter: Chris Hrobak
Editorial Assistant: Rebecca Beistline
Copy Editor: Joy Langel
Cover Design: Lisa Tosheff
Printed at: Yurchak Printing Inc.

Published in the United States of America by


Information Science Reference (an imprint of IGI Global)
701 E. Chocolate Avenue, Suite 200
Hershey PA 17033
Tel: 717-533-8845
Fax: 717-533-8661
E-mail: cust@igi-global.com
Web site: http://www.igi-global.com
and in the United Kingdom by
Information Science Reference (an imprint of IGI Global)
3 Henrietta Street
Covent Garden
London WC2E 8LU
Tel: 44 20 7240 0856
Fax: 44 20 7379 0609
Web site: http://www.eurospanbookstore.com
Copyright © 2009 by IGI Global. All rights reserved. No part of this publication may be reproduced, stored or distributed in any form or by
any means, electronic or mechanical, including photocopying, without written permission from the publisher.
Product or company names used in this set are for identification purposes only. Inclusion of the names of the products or companies does
not indicate a claim of ownership by IGI Global of the trademark or registered trademark.
Library of Congress Cataloging-in-Publication Data
Online consumer protection : theories of human relativism / Kuanchin Chen and Adam Fadlalla, editors.
p. cm.
Summary: "This book is designed to offer readers a comprehensive way to understand the nature of online threats, consumer concerns, and
techniques for online privacy protection"--Provided by publisher.
Includes bibliographical references and index.
ISBN 978-1-60566-012-7 (hardcover) -- ISBN 978-1-60566-013-4 (ebook)
1. Consumer protection. 2. Ethical relativism. 3. Privacy, Right of. 4. Electronic commerce--Security measures. 5. Electronic
information resources--Access control. 6. Disclosure of information. 7. Computer crimes. I. Chen, Kuanchin. II. Fadlalla, Adam.
HC79.C63O54 2009
381.3'402854678--dc22
2008010313
British Cataloguing in Publication Data
A Cataloguing in Publication record for this book is available from the British Library.
All work contributed to this book set is original material. The views expressed in this book are those of the authors, but not necessarily of
the publisher.
If a library purchased a print copy of this publication, please go to http://www.igi-global.com/agreement for information on activating
the library's complimentary electronic access to this publication.

Table of Contents

Preface ................................................................................................................................................ xiv


Acknowledgment ................................................................................................................................ xx

Section I
Background
Chapter I
Google: Technological Convenience vs. Technological Intrusion ......................................................... 1
Andrew Pauxtis, Quinnipiac University, USA
Bruce White, Quinnipiac University, USA
Chapter II
A Taxonomic View of Consumer Online Privacy Legal Issues, Legislation, and Litigation .............. 16
Angelena M. Secor, Western Michigan University, USA
J. Michael Tarn, Western Michigan University, USA
Chapter III
Online Privacy, Vulnerabilities, and Threats: A Manager's Perspective............................................... 33
Hy Sockel, DIKW Management Group, USA
Louis K. Falk, University of Texas at Brownsville, USA

Section II
Frameworks and Models
Chapter IV
Practical Privacy Assessments .............................................................................................................. 57
Thejs Willem Jansen, Technical University of Denmark, Denmark
Søren Peen, Technical University of Denmark, Denmark
Christian Damsgaard Jensen, Technical University of Denmark, Denmark

Chapter V
Privacy and Trust in Online Interactions .............................................................................................. 85
Leszek Lilien, Western Michigan University, USA
Bharat Bhargava, Purdue University, USA
Chapter VI
Current Measures to Protect E-Consumers' Privacy in Australia ...................................................... 123
Huong Ha, Monash University, Australia
Ken Coghill, Monash University, Australia
Elizabeth Ann Maharaj, Monash University, Australia
Chapter VII
Antecedents of Online Privacy Protection Behavior: Towards an Integrative Model ........................ 151
Anil Gurung, Neumann College, USA
Anurag Jain, Salem State College, USA

Section III
Empirical Assessments
Chapter VIII
Privacy Control and Assurance: Does Gender Influence Online Information Exchange? ................ 165
Alan Rea, Western Michigan University, USA
Kuanchin Chen, Western Michigan University, USA
Chapter IX
A Profile of the Demographics, Psychological Predispositions, and Social/Behavioral Patterns
of Computer Hacker Insiders and Outsiders ...................................................................................... 190
Bernadette H. Schell, University of Ontario Institute of Technology, Canada
Thomas J. Holt, The University of North Carolina at Charlotte, USA
Chapter X
Privacy or Performance Matters on the Internet: Revisiting Privacy Toward a Situational
Paradigm ............................................................................................................................................ 214
Chiung-wen (Julia) Hsu, National Cheng Chi University, Taiwan

Section IV
Consumer Privacy in Business
Chapter XI
Online Consumer Privacy and Digital Rights Management Systems ............................................... 240
Tom S. Chan, Southern New Hampshire University, USA
J. Stephanie Collins, Southern New Hampshire University, USA
Shahriar Movafaghi, Southern New Hampshire University, USA

Chapter XII
Online Privacy and Marketing: Current Issues for Consumers and Marketers ................................. 256
Betty J. Parker, Western Michigan University, USA
Chapter XIII
An Analysis of Online Privacy Policies of Fortune 100 Companies ................................................. 269
Suhong Li, Bryant University, USA
Chen Zhang, Bryant University, USA
Chapter XIV
Cross Cultural Perceptions on Privacy in The United States, Vietnam, Indonesia, and Taiwan ....... 284
Andy Chiou, National Cheng Kung University, Taiwan
Jeng-chung V. Chen, National Cheng Kung University, Taiwan
Craig Bisset, National Cheng Kung University, Taiwan

Section V
Policies, Techniques, and Laws for Protection
Chapter XV
Biometric Controls and Privacy ......................................................................................................... 300
Sean Lancaster, Miami University, USA
David C. Yen, Miami University, USA
Chapter XVI
Government Stewardship of Online Information: FOIA Requirements and Other
Considerations .................................................................................................................................... 310
G. Scott Erickson, Ithaca College, USA
Chapter XVII
The Legal Framework for Data and Consumer Protection in Europe ............................................... 326
Charles O'Mahony, Law Reform Commission of Ireland, Ireland
Philip Flaherty, Law Reform Commission of Ireland, Ireland
Chapter XVIII
Cybermedicine, Telemedicine, and Data Protection in the United States ......................................... 347
Karin Mika, Cleveland State University, USA
Barbara J. Tyler, Cleveland State University, USA

Chapter XIX
Online Privacy Protection in Japan: The Current Status and Practices ............................................. 370
J. Michael Tarn, Western Michigan University, USA
Naoki Hamamoto, Western Michigan University, USA

Compilation of References .............................................................................................................. 388


About the Contributors ................................................................................................................... 429
Index ................................................................................................................................................ 436

Detailed Table of Contents

Preface ................................................................................................................................................ xiv


Acknowledgment ................................................................................................................................ xx

Section I
Background
Chapter I
Google: Technological Convenience vs. Technological Intrusion ......................................................... 1
Andrew Pauxtis, Quinnipiac University, USA
Bruce White, Quinnipiac University, USA
Search engines can log and stamp each search made by end-users and use that collected data for an
assortment of business advantages. In a world where technology gives users many conveniences, one
must weigh the benefits of those conveniences against the potential intrusions of personal privacy.
With Google's eyes on moving into radio, television, print, and other technologies, one must step back
and examine the potential privacy risks associated with the technological conveniences being provided
by Google. This chapter gives an overview of Google's services and how they relate to personal
privacy online.
Chapter II
A Taxonomic View of Consumer Online Privacy Legal Issues, Legislation, and Litigation .............. 16
Angelena M. Secor, Western Michigan University, USA
J. Michael Tarn, Western Michigan University, USA
This chapter offers a review of online privacy issues, particularly in the areas of consumer online privacy
legislation and litigation, the relationships among privacy issues, legal protections, and the risks of privacy
violations. A survey of the privacy literature provides insights on privacy protection and privacy concern. Results show a need for stronger intervention by the government and the business community.
Consumers' privacy awareness is also key to successful protection online. The chapter concludes
with a call for consumer privacy education to promote privacy awareness, and for timely responses by government and businesses to privacy violations.

Chapter III
Online Privacy, Vulnerabilities, and Threats: A Manager's Perspective............................................... 33
Hy Sockel, DIKW Management Group, USA
Louis K. Falk, University of Texas at Brownsville, USA
Victims of online threats are not necessarily just individual consumers anymore. Management must be
ready to neutralize, reduce, and prevent these threats if the organization is going to maintain its viability in today's business environment. This chapter is designed to give managers the overview needed to
gain a working knowledge of online privacy, vulnerabilities, and threats. The chapter also highlights
techniques that are commonly used to impede attacks and protect the privacy of the organization, its
customers, and employees.

Section II
Frameworks and Models
Chapter IV
Practical Privacy Assessments .............................................................................................................. 57
Thejs Willem Jansen, Technical University of Denmark, Denmark
Søren Peen, Technical University of Denmark, Denmark
Christian Damsgaard Jensen, Technical University of Denmark, Denmark
This chapter proposes a privacy assessment model, called the operational privacy assessment model,
that includes organizational, operational, and technical factors for the protection of personal data stored
in an IT system. The factors can be evaluated on a simple scale so that not only can the resulting graphical
depiction be easily created for an IT system, but graphical comparisons across multiple IT systems
are also possible. Examples of factors presented in a Kiviat graph are also given. This assessment
tool may be used to standardize privacy assessment criteria, making it less painful for management
to assess privacy risks in their systems.
Chapter V
Privacy and Trust in Online Interactions .............................................................................................. 85
Leszek Lilien, Western Michigan University, USA
Bharat Bhargava, Purdue University, USA
Trust is an essential ingredient for a successful interaction or collaboration among different parties. Trust
is also built upon the belief that the privacy of the involved parties is protected before, during, and after
the interaction. This chapter presents different trust models, the interplay between trust and privacy, and
the metrics for these two related concepts. In particular, it shows how one's degree of privacy can be
traded for a gain in the level of trust perceived by the interaction partner. The idea and mechanisms of
trading privacy for trust are also explored.

Chapter VI
Current Measures to Protect E-Consumers' Privacy in Australia ...................................................... 123
Huong Ha, Monash University, Australia
Ken Coghill, Monash University, Australia
Elizabeth Ann Maharaj, Monash University, Australia
Australia uses regulation/legislation, guidelines, codes of practice, and activities of consumer associations
and the private sector to enhance protection of consumers' privacy. This chapter is designed to report
Australians' experience in privacy protection. In particular, the four main areas of protection outlined
above are analyzed to draw implications. Recommendations address the coverage of legislation,
uniformity of regulations, relationships among guidelines and legislation, and consumer awareness.
Chapter VII
Antecedents of Online Privacy Protection Behavior: Towards an Integrative Model ........................ 151
Anil Gurung, Neumann College, USA
Anurag Jain, Salem State College, USA
This chapter proposes an integrated framework to model online privacy protection behavior. Factors
in this framework are drawn from recent Internet and online privacy studies. Although many possible
factors can be included in the framework, the authors took a very conservative approach to include in
their framework only those factors that were formally studied in the academic literature. This framework
serves as the basis for future extensions or empirical assessments.

Section III
Empirical Assessments
Chapter VIII
Privacy Control and Assurance: Does Gender Influence Online Information Exchange? ................ 165
Alan Rea, Western Michigan University, USA
Kuanchin Chen, Western Michigan University, USA
One main reason that online users are wary of providing personal information is that they lack trust
in e-businesses' personal information policies and practices. As a result, they exercise several forms of
privacy control as a way to protect their personal data online. This chapter presents survey results of
how the two genders differ in the ways they control their private data on the Internet. Findings provide
guidelines for e-businesses to adjust their privacy policies and practices to increase information and
transactional exchanges.

Chapter IX
A Profile of the Demographics, Psychological Predispositions, and Social/Behavioral Patterns
of Computer Hacker Insiders and Outsiders ...................................................................................... 190
Bernadette H. Schell, University of Ontario Institute of Technology, Canada
Thomas J. Holt, The University of North Carolina at Charlotte, USA
Research about hackers is scarce, but the impact on privacy that hackers bring to the Internet world should
not be underestimated. This chapter looks at the demographics, psychological predispositions, and social/
behavioral patterns of computer hackers to better understand the harms that can be caused. Results show that
online breaches and online concerns regarding privacy, security, and trust will require much more complex
solutions than currently exist. Teams of experts in fields such as psychology, criminology, law, and information technology security need to collaborate to bring about more effective solutions for the virtual world.
Chapter X
Privacy or Performance Matters on the Internet: Revisiting Privacy Toward a Situational
Paradigm ............................................................................................................................................ 214
Chiung-wen (Julia) Hsu, National Cheng Chi University, Taiwan
This chapter introduces a situational paradigm to study online privacy. Online privacy concerns and
practices are examined within two contexts: technology platforms and users' motivations. Results show
a distinctive staging phenomenon under the theory of uses and gratifications and an a priori theoretical framework. The diffused audience was less concerned about privacy but did not disclose
personal information any more than the other groups. Users may act differently on diverse platforms or in different
environments, implying that treating Internet users as a homogeneous group or assuming they act
the same way across different environments is a problematic assumption.

Section IV
Consumer Privacy in Business
Chapter XI
Online Consumer Privacy and Digital Rights Management Systems ............................................... 240
Tom S. Chan, Southern New Hampshire University, USA
J. Stephanie Collins, Southern New Hampshire University, USA
Shahriar Movafaghi, Southern New Hampshire University, USA
The business values of using the Internet for the delivery of soft media may be hampered when the owners
risk losing control of their intellectual property. Any business that wishes to control access to and use of
its intellectual property is a potential user of digital rights management (DRM) technologies. Managing,
preserving, and distributing digital content through DRM is not without problems. This chapter offers
a critical review of DRM and issues surrounding its use.

Chapter XII
Online Privacy and Marketing: Current Issues for Consumers and Marketers ................................. 256
Betty J. Parker, Western Michigan University, USA
Certain marketing practices may sometimes cause privacy conflicts between businesses and consumers.
This chapter offers insights into privacy concerns from today's marketing practices on the Internet. Specifically, areas of focus include current privacy issues, the use of spyware and cookies, word-of-mouth
marketing, online marketing to children, and the use of social networks. Related privacy practices,
concerns, and recommendations are presented from the perspectives of Internet users, marketers, and
government agencies.
Chapter XIII
An Analysis of Online Privacy Policies of Fortune 100 Companies ................................................. 269
Suhong Li, Bryant University, USA
Chen Zhang, Bryant University, USA
This chapter examines the current status of online privacy policies of Fortune 100 companies. Results
show that 94% of the surveyed companies have posted an online privacy policy and 82% collect personal
information from consumers. Additionally, the majority of the companies only partially follow the four
principles (notice, choice, access, and security) of fair information practices. Few organizations have
obtained third-party privacy seals, including TRUSTe, BBBOnline Privacy, and Safe Harbor.
Chapter XIV
Cross Cultural Perceptions on Privacy in The United States, Vietnam, Indonesia, and Taiwan ....... 284
Andy Chiou, National Cheng Kung University, Taiwan
Jeng-chung V. Chen, National Cheng Kung University, Taiwan
Craig Bisset, National Cheng Kung University, Taiwan
This chapter studies concerns about Internet privacy across multiple cultures. Students from several countries were recruited to participate in a focus group study in order to discover the differences in their
privacy concerns. Collectivistic cultures appear to be less sensitive to violations of personal privacy,
while individualistic cultures are found to be more proactive in privacy protection. Implications are
provided.

Section V
Policies, Techniques, and Laws for Protection
Chapter XV
Biometric Controls and Privacy ......................................................................................................... 300
Sean Lancaster, Miami University, USA
David C. Yen, Miami University, USA

This chapter provides an overview of biometric controls to protect individual privacy. Although much of
the discussion targets protection of physical privacy, some may also apply to online consumer privacy.
Discussion is focused on four main areas: technological soundness, economic values, business applications, and legal/ethical concerns. Further insights are provided.
Chapter XVI
Government Stewardship of Online Information: FOIA Requirements and Other
Considerations .................................................................................................................................... 310
G. Scott Erickson, Ithaca College, USA
This chapter focuses on the issues surrounding the federal Freedom of Information Act and associated
state and local laws for their implications on personal privacy. Despite the good intentions of these laws
to enable openness in government, confidential business information and private personal information
may be vulnerable when data are in government hands. This chapter offers readers a better understanding of several trends regarding the statutes and their interpretations.
Chapter XVII
The Legal Framework for Data and Consumer Protection in Europe ............................................... 326
Charles O'Mahony, Law Reform Commission of Ireland, Ireland
Philip Flaherty, Law Reform Commission of Ireland, Ireland
This chapter discusses the legal framework and the law of the European Union (EU) for consumer and
data protection. The creation of legal frameworks in Europe aims to secure the protection of consumers
while simultaneously facilitating economic growth in the European Union. This chapter outlines the main
sources of privacy protection law and critically analyzes the important provisions in these sources of law.
Gaps and deficiencies in the legal structures for consumer and data protection are also discussed.
Chapter XVIII
Cybermedicine, Telemedicine, and Data Protection in the United States ......................................... 347
Karin Mika, Cleveland State University, USA
Barbara J. Tyler, Cleveland State University, USA
This chapter provides an overview of law relating to Internet medical practice, data protection, and consumer information privacy. It provides a comprehensive overview of federal (HIPAA) and state privacy
laws. Readers are given advice on the legal and data protection problems consumers will encounter in
purchasing medical and health services on the Internet. Furthermore, actual case studies and expert advice
are provided to offer a safer online experience. The authors also advocate that the United States must
enact more federal protection for the consumer in order to deter privacy violations and punish criminal,
negligent, and willful violations of personal consumer privacy.

Chapter XIX
Online Privacy Protection in Japan: The Current Status and Practices ............................................. 370
J. Michael Tarn, Western Michigan University, USA
Naoki Hamamoto, Western Michigan University, USA
This chapter reports the current status and practices of online privacy protection in Japan. It offers a
perspective of an eastern culture regarding the concept of privacy, its current practices, and how it is
protected. Following the discussion of the Japanese privacy law called the Act on the Protection of Personal
Information, Japan's privacy protection mechanisms to support and implement the new act are examined.
The authors also offer a four-stage privacy protection solution model as well as two case studies to show
readers the problems, dilemmas, and solutions for privacy protection from Japan's experience.

Compilation of References .............................................................................................................. 388


About the Contributors ................................................................................................................... 429
Index ................................................................................................................................................ 436


Preface

Privacy, the right to be left alone, is a fundamental human right. Risks of the contrary, privacy invasion, have increased in significant proportions in a world increasingly turning online. In today's networked
world, a fast-growing number of users are hopping on and off the Internet superhighways, multiple times
every day, more so than they hop on and off physical expressways. Internet users are also doing more
diverse activities online, including browsing, shopping, communicating, chatting, gaming, and even
working. With so much online presence, users find themselves, in many situations, divulging information
that they otherwise might not, due to privacy concerns. Users may even be wary of getting online
because of fear of possible privacy invasion from the many preying eyes on the Internet. The issue is
not whether privacy should be protected; rather, the issue is how it should be protected in the vast
online world where information can be intercepted, stolen, quickly transported, shared without the
user's knowledge, or even sold for profit. Compared to an offline environment, the Internet enables collection of
more information from users cost effectively, sometimes even without their consent. Thus, the Internet
poses a greater privacy threat for users as their personal information is transmitted over the Internet if an
organization does not have a good security mechanism in place. Furthermore, the connectivity of the
Internet allows capturing, building, and linking of electronic profiles and behaviors of users.
Online privacy is a multidimensional concept and thus has been addressed in research from a multiplicity of angles, albeit not equally thoroughly. Much research effort has focused on addressing privacy
as a technological factor and hence proposed technical solutions to privacy protection. Although this is
an important dimension of online privacy, there are equally, if not more, important dimensions, such as
context, culture, perceptions, and legislation. Such softer (non-technological) aspects of privacy cannot be understood by only looking at the technological aspects of privacy. The human dimension is as
complex and as important for getting a more complete understanding of privacy. At a micro level, not
only do individuals have varying requirements for privacy, but the same individual's requirements
may change over time or between situational contexts. Responses to privacy invasion may differ greatly between individuals and situational contexts. There may also be a gap between individuals'
desired and actual behaviors in relation to their privacy concerns. Individuals may have more stringent
privacy requirements than what their actual online practice reflects. Online privacy researchers have offered
less coverage of these human factors, but understanding these factors, and many more, is key to gaining
a better understanding of online privacy, hence the human relativism.
At a macro level, privacy requirements and response to privacy invasion may vary across cultures,
societies, and business situations. Organizational practices of privacy policies and responses to incidents of privacy invasion affect people's perceptions of the current state of privacy, and consequently
affect their trust in the organization. People are generally concerned about how personal information is
collected, used, and distributed beyond its original purpose and beyond the parties originally involved.
Breaches in how their information is collected, used, or shared, and responses to such breaches, directly
impact their privacy concerns and their trust. There is still not sufficient empirical evidence to answer
many privacy questions at these macro levels, and many human aspects of online privacy in some social
and cultural settings have not yet received enough research attention. Consequently, our understanding
of the relationships between online privacy and dimensions such as culture, user characteristics, business
context, technology use, and education is still limited.
The world is increasingly turning online and there will be no reversal of this trend. To protect the
privacy of online users and to consequently achieve the full potential of transacting in an online world,
the issue of online privacy needs to be understood from multiple facets. The challenge is to minimize
constraints of online dealings without compromising users' privacy. Such delicate balancing cannot
be achieved without a broad understanding of online privacy. This book is an attempt to provide such
understanding by offering a comprehensive and balanced coverage of the various dimensions of online
privacy. Many previously published books either treat privacy as a sub-topic under a broader topic of
end-user computing or information systems or focus primarily on technical issues or managerial strategies. Many others focus on end users and offer only introductory material or general guidelines to
enhance personal online security and privacy. While this treatment of these important topics of privacy
is appropriate for its intended use and audience, it does not allow for a broader and more extensive
examination of online privacy and how it guides practice.
Furthermore, many gaps in privacy, threats, and fraud theories have not yet been filled. The most
prominent such gaps include linking privacy theories to other established theories and frameworks in
information technology or related disciplines. For example, cultural, social, and behavioral issues in
privacy have not received enough attention. Research on human aspects, as well as empirical assessment of privacy issues, is lacking. Research on linking privacy considerations to business practices,
educational curriculum development/assessment, and legislative impacts is also scarce. Many studies
have focused on technological advancements, such as security protection and cryptography to offer
technical tools for privacy protection and for assessing risks of privacy invasion. Although such focus
is a must to protect users from these risks, technology is not equivalent to protection. For a protection
scheme to work well, both technical and human aspects have to work in harmony. A major goal of this
book is to provide a view of privacy that integrates the technical, human, cultural, and legal aspects of
online privacy protection as well as risks and threats to privacy invasion.
The book aims at (a) promoting research and practice in various areas of online privacy, threat assessment, and privacy invasion prevention, (b) offering a better understanding of human issues in these
areas, and (c) furthering the development of online privacy education and legislation. The book goes
beyond introductory coverage and includes contemporary research on the various dimensions of online
privacy. It aims to be a reference for professionals, academics, researchers, and practitioners interested
in online privacy protection, threats, and prevention mechanisms. The book is the result of research
efforts from content experts, and thus it is an essential reference for graduate courses and professional
seminars.
There are 19 great chapters in the book, grouped into five sections: (1) background, (2) frameworks
and models, (3) empirical assessments, (4) consumer privacy in business, and (5) policies, techniques,
and laws for protection.
The background section provides an overview of privacy for those who prefer a short introduction
to the subject. In Chapter I, Pauxtis and White point out the serious privacy implications of online
searches. Search engines can log and stamp each search made by end-users and use that collected data
for an assortment of business advantages. In a world where technology gives users many conveniences,
one must weigh the benefits of those conveniences against the potential intrusions of personal privacy.
Nevertheless, end-users will always use search engines. They will always "Google" something on their
mind. The authors conclude that while the vast majority of casual Internet users either do not know
Google's data collection policies or simply do not care, at the end of the day it comes down to the
simple fact that we as a society must put our trust in the technological innovations that have become
commonplace conveniences.
In Chapter II, Secor and Tarn bring to the forefront the importance of legal protection and
privacy awareness and present a taxonomic view to explore the relationship of the issues, legal protections, and the remedies and risks for not complying with the legal requirements. The authors use two
survey studies to reinforce the vital need for a stronger role by the government and business community
as well as privacy awareness from online consumers themselves. The chapter concludes with a vital
call for consumer privacy education and awareness, for government and legislators' attention, and for timely
responses with legislation that protects consumers against those who would misuse the technology.
In Chapter III, Sockel and Falk highlight the gravity of vulnerabilities to privacy in that it is not
uncommon for employees to work offsite, at home, or out of a hotel room, often using less than secure
Internet connections: dial-up, cable, Internet cafés, libraries, and wireless. The chapter highlights the
relationship between vulnerability, threats, and action in what the authors term the "risk triangle." It
delves into techniques that are commonly used to thwart attacks and protect individuals' privacy, and
discusses how, in the age of unrest and terrorism, privacy has grown even more important, as freedoms
are compromised for security. The chapter provides an overview of the various vulnerabilities, threats,
and actions to ameliorate them.
Section II consists of four chapters that offer frameworks or models to study various privacy issues.
In Chapter IV, Jansen, Peen, and Jensen turn attention to the claim that "most of the current work
has focused on technical solutions to anonymous communications and pseudonymous interactions, but,
in reality, the majority of privacy violations involve careless management of government IT-systems,
inadequate procedures or insecure data storage." The authors introduce a privacy assessment model,
called the Operational Privacy Assessment Model, that includes organizational, operational, and technical
factors. The factors can be evaluated on a simple scale so that not only can the resulting graphical depiction
be easily created for an IT system, but graphical comparisons across multiple IT systems are also
possible. Although their method has been developed in the context of government IT-systems in Europe, they believe that it may also apply to other government systems, non-governmental organisations
(NGOs), and large private companies.
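To make the flavor of such an assessment concrete, the sketch below scores two hypothetical IT systems on a handful of invented privacy factors using a simple 0-4 scale. The factor names, scale, and system names are illustrative only and are not the factors defined in the chapter, where the results would instead be presented as Kiviat (radar) graphs for visual comparison.

```python
# A minimal, hypothetical sketch in the spirit of an operational privacy assessment:
# score each system on a fixed set of factors, then compare systems side by side.
# Factor names and the 0 (weak) .. 4 (strong) scale are invented for illustration.
FACTORS = ["data minimization", "access control", "staff training",
           "retention policy", "encryption at rest", "audit logging"]

def assess(system_name, scores):
    """scores: dict mapping each factor to an integer 0 (weak) .. 4 (strong)."""
    missing = [f for f in FACTORS if f not in scores]
    if missing:
        raise ValueError(f"{system_name}: unscored factors {missing}")
    return {f: scores[f] for f in FACTORS}

def compare(assessments):
    """Print a simple side-by-side table; the same data could be plotted as
    overlapping Kiviat graphs to compare multiple IT systems visually."""
    print("factor".ljust(22) + "".join(name.rjust(12) for name in assessments))
    for f in FACTORS:
        print(f.ljust(22) + "".join(str(a[f]).rjust(12) for a in assessments.values()))

systems = {
    "Registry A": assess("Registry A", dict.fromkeys(FACTORS, 3)),
    "Portal B": assess("Portal B", {**dict.fromkeys(FACTORS, 2), "encryption at rest": 4}),
}
compare(systems)
```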
In Chapter V, Lilien and Bhargava underline the strong relationship between privacy and trust. The
authors contend that the role of trust and privacy is as fundamental in computing environments as it is
in social systems. The chapter presents this role in online interactions, emphasizing the close relationship between trust and privacy, and shows how one's degree of privacy can be traded for a gain in the
level of trust perceived by one's interaction partner. The chapter explores in detail the mechanisms of
this core theme of trading privacy for trust. It also presents different trust models, the interplay between
trust and privacy, and the metrics for these two related concepts.
In Chapter VI, Ha, Coghill, and Maharaj offer an Australian perspective on measures to protect e-consumers' privacy and the current state of e-consumer privacy protection, and discuss policy implications
for the protection of e-consumers' privacy. The authors suggest that although privacy protection measures
in the form of legislation, guidelines, and codes of practice are available, their effectiveness is limited in
alleviating consumers' privacy and security concerns. The authors contend that protection of consumers'
personal information also depends on how e-retailers exercise their corporate social responsibility to
provide protection to e-consumers.
In Chapter VII, Gurung and Jain review the existing literature and analyze existing online privacy theories, frameworks, and models to understand the variables that are used in the context of online
privacy protection. The authors develop an integrative framework to encapsulate the antecedents of
online privacy protection behavior.
Section III includes research studies that report empirical findings on various privacy topics. One
main reason that online users are wary of providing personal information is that they lack trust in
e-businesses' personal information policies and practices. As a result, they exercise several forms of
privacy control as a way to protect their personal data online. In Chapter VIII, Rea and Chen report
survey results of how the two genders differ in their ways to control their private data on the Internet.
Findings provide guidelines for e-businesses to adjust their privacy policies and practices to increase
information and transactional exchanges.
Discussion on privacy is incomplete without a glimpse into hackers and crackers, the elite corps
of computer designers and programmers, according to Schell and Holt in Chapter IX. Schell and Holt
argue that it is vital that researchers understand the psychological and behavioral composition of network attackers and the social dynamics that they operate within. This understanding can improve our
knowledge of cyber intruders and aid in the development of "effective techniques and best practices
to stop them in their tracks." Such techniques can minimize damage to consumer confidence, privacy,
and security in e-commerce Web sites and general information-sharing within and across organizations.
The authors discuss known demographic and behavioral profiles of hackers and crackers, psychological
myths and truths about those in the computer underground, and how present strategies for dealing with
online privacy, security, and trust issues need to be improved.
In Chapter X, Hsu adds a perspective from communications to the ongoing debate on online privacy.
She examines why online privacy researchers have failed to explain why users who assert higher privacy
concerns still disclose sensitive information. The author argues that this is due to ignoring the social
context (what the author terms the "situational paradigm") in research on online privacy. The author seeks to
offer further support for the situational paradigm from the newly emerging phenomenon
of online photo album Web sites in Taiwan.
Section IV focuses on consumer privacy in business and consists of four chapters. In Chapter XI,
Chan, Collins, and Movafaghi tackle the issue of online consumer privacy and digital rights management
(DRM) systems for protecting digitally stored content. This protection may be accomplished through
different strategies or combinations of strategies including: identifying authorized users, identifying
genuine content, verifying proof of ownership and purchase, uniquely identifying each copy of the
content, preventing content copying, tracking content usage and distribution, and hiding content from
unauthorized users. The authors argue that DRM systems may change the business model from a traditional buy-and-own to a pay-per-use, but caution that this may pose great risks to consumers and society
as DRM technologies may weaken the rights to privacy and fair use, and threaten freedom of expression.
The chapter discusses the conflict between the rights of content owners and the privacy rights of content
users, and explores several DRM techniques and how their use could affect consumer privacy.
In Chapter XII, Parker offers views on online privacy from a marketing perspective in the context
of consumer marketing. The chapter provides insights into the ways that online privacy has become a
balancing act in which the needs of businesses are oftentimes balanced against the needs of consumers.
A number of privacy issues that affect the marketing of products and services are presented, along with
recommended best practices. The issues discussed include: (1) consumer, marketer, and government
perspectives on data collection, ownership and dissemination; (2) online advertising and the use of
cookies and spyware; (3) word-of-mouth marketing and the use of blogs, sponsored chat, and bulletin
boards; (4) marketing online to children; and (5) privacy issues in social networks and online communities. The chapter represents one of the first analyses of online marketing practices and their associated
privacy issues.


In Chapter XIII, Li and Zhang offer analysis of online privacy policies of Fortune 100 companies
within the context of the four principles (notice, choice, access, and security) of fair information practices. The authors found that 94% of the surveyed companies posted an online privacy policy and 82%
of them collect personal information from consumers. The majority of the companies only partially
follow the four principles of fair information practices. In particular, organizations fall short on security
requirements; only 19% mention that they have taken steps to provide security for information both
during transmission and after their sites have received the information. The authors conclude that a well-designed
privacy policy by itself is not adequate to guarantee privacy protection; effective implementation is just as important. Consumer education and awareness are also essential for privacy protection.
In Chapter XIV, Chiou, Chen, and Bisset focus attention on the important question of online privacy
across cultures by analyzing cultural perceptions on privacy in the United States, Vietnam, Indonesia,
and Taiwan. The authors point out clear differences between how personal information is viewed in the
United States and Asia. For example, an American in Taiwan might feel suspicious if asked to provide
his passport number by a community Web site, while a Taiwanese in the United States might be puzzled
and alienated by the fierceness with which people guard their private lives. The authors argue that such
differences should be considered in cross-cultural online privacy research and legislation. Furthermore,
due to the various cultural differences and backgrounds that form privacy perceptions, great care and
sensitivity should be exercised when conducting privacy studies across cultures.
Section V deals with policies, techniques, and laws for privacy protection. In Chapter XV, Lancaster and Yen focus on the important linkage between biometric controls and privacy. Biometrics is an
application of technology to authenticate users' identities through the measurement of physiological or
behavioral patterns, and thus does not suffer from the shortcoming of external authentication techniques that
rely on items that can be lost, forgotten, stolen, or duplicated. The authors conclude that, with adequate
communication, users are likely to appreciate systems that allow them the ease of use and convenience
that biometric systems offer, and hence their use will continue to grow in the future.
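As a rough illustration of how such authentication differs from password checking, the toy sketch below compares a stored template with a freshly measured sample against a similarity threshold. The feature vectors and the threshold value are invented for illustration and do not represent any real biometric system.

```python
# A toy illustration of threshold-based biometric matching (not any real system):
# a stored template is compared with a live sample and accepted only if their
# similarity clears a preset threshold.
def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = lambda v: sum(x * x for x in v) ** 0.5
    return dot / (norm(a) * norm(b))

def authenticate(stored_template, live_sample, threshold=0.95):
    """Unlike a password, the comparison is approximate: physiological or
    behavioral measurements never repeat exactly, so a similarity threshold
    trades false accepts against false rejects."""
    return cosine_similarity(stored_template, live_sample) >= threshold

# Hypothetical feature vectors, e.g., extracted from a fingerprint scan.
print(authenticate([0.2, 0.8, 0.5], [0.21, 0.79, 0.52]))  # True: close match
print(authenticate([0.2, 0.8, 0.5], [0.9, 0.1, 0.3]))     # False: different user
```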
In Chapter XVI, Erickson discusses the important issue of the tension between openness in government and personal privacy. The trend in the federal legislature has been to continually strengthen the
FOIA and openness by reaffirming a presumption that government records should be released unless
there is a compelling reason not to. Alternatively, the trend in agency practice and the courts has been
toward more privacy, allowing use of certain exemptions in the FOIA to deny records to individuals or
organizations seeking them. This balance has been clarified somewhat by legislation on electronic records,
agency practice, and a number of court cases suggesting agencies can limit releases to central purpose
activities and records not including individually identifiable information. The author also considers the
status and vulnerability of confidential business information passed on to governments and the status
and vulnerability of government databases concerning individual citizens. The main conclusion of the
chapter is that matters remain in flux in the legal aspects of privacy, and regardless of which way the
balance tips (openness vs. privacy), more certainty will help government, organizations, and individuals
better plan how and when to share their own information resources.
In Chapter XVII, O'Mahony and Flaherty discuss the legal framework for consumer and data protection in Europe, which seeks to secure the protection of consumers while simultaneously facilitating
economic growth in the European Union. The chapter outlines the main sources of law which protect
consumers and their privacy, the important provisions in these sources of law and critically analyzes
them, and points out the gaps and deficiencies in the consumer and data protection legal structures. The
authors argue that the creation of these legal rights and legal protections will only stem the misuse of
personal data if people know about the law and their rights and know how to access legal protections.
Thus, more needs to be done to ensure that citizens of the European Union are equipped with the
necessary knowledge to ensure that their personal data is treated with respect and in accordance with law.
The authors conclude that more focus needs to be put on ensuring greater compliance with the law,
particularly from businesses that have benefited from the free flow of data.
In Chapter XVIII, Mika and Tyler provide an overview of the law relating to cybermedicine and
telemedicine in terms of data protection and other legal complications related to licensing and a conflict of
state laws. The authors examine the laws applicable to Web sites where medical diagnosis or the purchase
of medical services (including prescriptions) is available. They discuss how the new methodology of
acquiring medical care is at odds with traditional notions of state regulation and how current laws, both
federal and state, leave many gaps related to any consumer protections or potential causes of action when
privacy is compromised. The authors offer some expert advice for consumers regarding using Web sites
for medical purposes as well as protecting their own privacy. Lastly, the authors advocate a federal law
more punitive than HIPAA, one that regulates and protects patient information, medical transactions,
and interactions on the Internet and deters violations of patient privacy by mandating significant fines
and imprisonment for negligent or criminal and willful violations of that privacy.
In Chapter XIX, Tarn and Hamamoto emphasize trans-border differences in the concepts of privacy; namely, that the concept of privacy in Japan is different from that in Western countries. They
explain how, after more and more privacy-related problems were revealed by the media, consumers
began to pay attention to the protection of their private information, and, in response, the Japanese government enacted legislation to protect consumers and regulate companies' business activities associated
with customers' private information. This exposed many weaknesses in companies' privacy protection
systems and revealed unethical uses of private data.
We cannot claim perfection of this book on online privacy, a broad and multidimensional concept.
Nevertheless, we believe it fills a major gap in the coverage of privacy by providing a comprehensive
treatment of the topic. Thus, it provides a single integrated source of information on a multitude of privacy dimensions including technical, human, cultural, personal, and legal aspects. Research on privacy
is still evolving and a varied and broad coverage as presented in this book is a valuable reference for
researchers, practitioners, professionals, and students.


Acknowledgment

We are grateful to numerous individuals whose assistance and contributions to the development of this
scholarly book either made this book possible or helped to make it better.
First, we would like to thank all chapter reviewers for their invaluable comments, which helped
ensure the intellectual value of this book. We would also like to express gratitude to our chapter authors
for their excellent contributions to this book.
Special thanks are due to the publishing team at IGI Global, in particular to our Managing Development Editor, Ms. Kristin Roth, who allowed her staff to provide invaluable support to keep the project on
schedule and of high quality, and to Dr. Mehdi Khosrow-Pour, whose vision motivated the development
of this pioneering project. This project would not have been successful without Ross Miller, Deborah
Yahnke, and Rebecca Beistline, who tirelessly offered their professional assistance during the development of this project.
Finally, we would like to give our heart-felt thanks to Kuanchin's wife, Jiajiun, and Adam's family
for their understanding and encouragement during the development of this book.
Kuanchin Chen and Adam Fadlalla

Section I

Background

Chapter I

Google:

Technological Convenience vs.


Technological Intrusion
Andrew Pauxtis
Quinnipiac University, USA
Bruce White
Quinnipiac University, USA

Abstract
What began as simple homepages that listed favorite Web sites in the early 1990s has grown into some
of the most sophisticated, enormous collections of searchable, organized data in history. These Web
sites are search engines, the golden gateways to the Internet, and they are used by virtually everyone.
Search engines, particularly Google, log and stamp each and every search made by end-users and use
that collected data for their own purposes. The data is used for an assortment of business advantages,
some of which the general population is not privy to, and most of which the casual end-user is typically
unfamiliar with. In a world where technology gives users many conveniences, one must weigh the benefits of those conveniences against the potential intrusions of personal privacy. Google's main stream of
revenue is their content-targeted AdWords program. AdWords, while not a direct instance of personal
privacy breach, marks a growing trend of invading personal space in order to deliver personalized
content. Gmail, Google's free Web-based e-mail service, marked a new evolution in these procedures,
scanning personal e-mail messages to deliver targeted advertisements. Google has an appetite for data,
and their hundreds of millions of users deliver that every week. With their eyes on moving into radio,
television, print, establishing an Internet service provider, furthering the technology of AdWords, and
creating and advancing technology in many other ventures, one must step back and examine the
potential privacy and intrusion risks associated with the technological conveniences being provided.



Introduction: The World of Search Engines
Now more than ever, the casual consumer is letting
their guard down on the Internet because of the
level of comfort gained over the past decade. The
Internet has become a norm of society and a staple
of culture. Many end-users accept the potential
risks of unveiling their credit card number online,
even at the most respected of retailers. While
having a credit card number compromised could
certainly cause a headache, the future of privacy
on the Internet does not have much to do with
those 16 magic digits. Instead, privacy, or lack
thereof, on the Internet has to do with something
all Internet users employ in their daily lives: the
search engine.
Privacy and general consumer protection on
the Internet is no longer exclusively limited to the
safeguarding of personal financial information
such as credit card numbers and bank accounts.
Other personal information is being given out
each and every day simply by using any major
search engine. Google, for instance, logs much
of what their users search for and then uses that
information to their advantage. With hundreds
of millions of logged searches each day, a search
engine like Google can analyze everything from
cultural and economic trends right on down to
what a given user is thinking or feeling based on
their search queries. This collection of information is a smoking stockpile of marketing data that
can then be utilized to build or better render other
personalized, content-targeted services.
Search engines provide the enormous service
of indexing billions of pages of data so that the
end-user can mine for a given query. To end-users,
this indexing and search service is the ultimate
convenience put out by the major search engine
companies. It allows us to locate documents,
images, videos, and more among billions of Web
pages in a matter of milliseconds. An Internet
without search engines would be an unorganized,
uncharted, unmeasured wilderness of Web pages.

Rather than having to shuffle through a floor full


of crumpled up, torn notebook pages, search engines put everything into finely labeled, organized
notebooks, an invaluable service no end-user
would ever sacrifice.
Web sites are typically archived, or indexed,
using advanced Web crawling "bots" or "spiders" that run off of servers and seek out new
Web pages or recently updated pages. A search
engine's business is built entirely on the practice
of collecting data, as much of it as possible.
Search engines began as simple, small listings
of useful Web sites in the 1990s. One decade
later, these simple listings have turned into one
of the most phenomenal collections of organized
data in history. Google, for instance, claims to
have over 10 billion pages of content indexed in
their search engine, with millions more pages
being added each day.
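The following minimal sketch illustrates the general idea of crawling and inverted indexing described above. It is a toy, single-machine example with a placeholder seed URL, not how a production search engine such as Google actually operates.

```python
# A minimal sketch of crawling and indexing (illustrative only; real search
# engines use distributed crawlers and far more sophisticated indexes).
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen
from collections import defaultdict

class LinkAndTextParser(HTMLParser):
    """Collects hyperlinks and visible text from a single HTML page."""
    def __init__(self):
        super().__init__()
        self.links, self.words = [], []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        self.words.extend(data.lower().split())

def crawl(seed_url, max_pages=10):
    """Breadth-first crawl that builds a toy inverted index: word -> set of URLs."""
    index, frontier, seen = defaultdict(set), [seed_url], set()
    while frontier and len(seen) < max_pages:
        url = frontier.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", errors="ignore")
        except Exception:
            continue  # skip unreachable or non-HTML pages
        parser = LinkAndTextParser()
        parser.feed(html)
        for word in parser.words:
            index[word].add(url)
        frontier.extend(urljoin(url, link) for link in parser.links)
    return index

# Hypothetical usage: index = crawl("https://example.com"); index.get("privacy")
```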
It is because of the search engines' easy access
to requested information that they have become
second-nature to Web users. People flock to
search engines without thinking twice. Google
has become a part of everyday society and a verb
in modern linguistics. When someone needs to
find something online, they simply Google it.
End-users enter names, addresses, phone numbers,
interests, health ailments, questions, fantasies,
and virtually anything imaginable into search
boxes. Every search is logged and saved. Every
user has a search engine fingerprint trail. The
data that search engines such as Google amass
from logging search queries is astronomical, and
the uses for such data are endless.
The value of such precise data alone, were it sold
to advertisers, would be priceless. Imagine an advertiser
who obtained the search data, in its entirety, that
Google has. They could immediately reconfigure
their marketing efforts with pinpoint precision.
Google's data reservoir is the Holy Bible of the
marketing universe. All the same, one could call
these piles of data some of the most dangerous
weapons in the world: identifying, damning,
incriminating search queries are logged by the
millions each and every day. Nevertheless, end-users will always use search engines. End-users
will always Google something on their mind.
Search engines are quick, convenient, and always
yield a precise or near-precise result. Who would
want to give that up?

Gateways to the Internet


Google is no longer just a search engine; it is a
portal to the Internet. Over the past several years
an architecture shift within the search engine
world has been occurring. Google, Yahoo!, and
an army of others, are providing much more than
just search results. Items like free e-mail and
Web site hosting have always been among the
traditional extensions of a search engine, but in
more recent years the bigger search engines have
launched impressive additions to their communities. Google, for instance, has recently launched a
suite of Web applications which rivals Microsoft
Office, complete with word processing and spreadsheets, all right on your Web browser. Pair that
with Google Maps, Google Talk, Google Calendar,
Google News, Google Checkout, Google Notebook, Google Groups, YouTube, Google Earth,
and Google Desktop, and one could argue never
having to leave the Google domain for all your
Internet needs! After all, the bigger the audience
they can attractand provide a technological
convenience or solution tothe better.
Why, exactly, does Google go through the hassle of launching new services? It is simple: search engines have the ultimate goal of finding new ways to provide conveniences to the end-user. The Google services listed above, along with the dozens of others Google provides, are just that: conveniences. The more services Google has to attract and retain its viewers, the better off it will be. Search engines live and die by the amount of traffic that flows through their massive networks each day. This traffic is their audience, to which advertisements can be displayed, much like commercials on television. Positioned at the top and sides of each and every traditional search engine results page, or SERP, are blocks of advertisements that directly correlate to the end-user's search query.
On Google, these blocks of advertisements are called AdWords. AdWords is one of the main support beams of the Google business infrastructure and the company's main stream of revenue. It allows end-users, from individuals to companies to international organizations, to purchase advertising space on the SERPs as well as in other areas of Google's massive advertising network. Prices for an advertising campaign are determined by who searches for which queries, and how frequently a given query is searched for. Google is not alone in this advertising architecture: Yahoo! runs its own program, Yahoo! Search Marketing, and MSN runs adCenter. Google AdWords, however, has set the bar for this type of pay-per-click (PPC) advertising, with Yahoo! and MSN usually following suit with any major changes Google makes to its program. AdWords, while not a direct instance of personal privacy breach, marks a growing trend of invading personal space in order to deliver personalized content.
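As a rough illustration of the pay-per-click idea, the short Python sketch below estimates what covering a keyword might cost from its search volume: the advertiser pays only per click, so a frequently searched query costs far more to cover than a niche one. The numbers and the formula are a toy model invented for illustration, not Google's actual auction, which also weighs competing bids and ad quality.

def estimate_campaign_cost(monthly_query_volume, click_through_rate, max_bid_per_click):
    """Toy pay-per-click model: the advertiser pays only when a searcher clicks,
    so cost scales with how often the keyword is searched for. Illustrative only."""
    expected_clicks = monthly_query_volume * click_through_rate
    return expected_clicks * max_bid_per_click

# A heavily searched keyword costs far more to cover than a niche one:
popular = estimate_campaign_cost(500_000, 0.02, 0.75)  # 10,000 clicks -> $7,500
niche = estimate_campaign_cost(2_000, 0.02, 0.75)      #     40 clicks -> $30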
One must also see things as Google does. This is its company. When end-users enter one of Google's many Web sites, they are in essence walking through the door of its store. Much like any responsible store owner who observes his or her patrons or inventory logs to see what sells and what does not, Google is ultimately doing the same, only on a much larger scale. Google is free of charge, and if it is not earning money through some sort of subscription-based or other e-commerce model, then displaying advertisements is the only sustainable, and sensible, way to go. Google CEO Eric Schmidt, on what it takes to run and improve the advertisements on Google:
More computers, basically, and better algorithms. And more information about you. The more personal information you're willing to give us, and you have to choose to give it to us, the more we can target. The standard example is: When you say "hot dog," are you referring to the food, or is your dog hot? So the more personalized the information, the better the targeting. We also have done extensive engineering work with Google Analytics to understand why people click on ads. That way we can actually look at the purchase and go back and see what buyers did to get there. That is the holy grail in advertising, because advertisers don't advertise just to advertise, they advertise to sell something. (Google CEO Eric Schmidt, quoted in Fred Vogelstein, Wired Magazine, 4/9/2007)

The Fine Line between Privacy and Public Domain
While Google may certainly be well within its rights to collect some degree of data from its users, one needs to determine where public domain ends and personal privacy begins. In order for a degree of privacy to exist on the World Wide Web, one must first chart its boundaries. Definitions of Internet privacy come on many different levels and in many different flavors. Ali Salehnia, author of Ethical Issues of Information Systems (2002), defines Internet privacy as "the seclusion and freedom from unauthorized intrusion." Salehnia goes on to emphasize the word "unauthorized," explaining that the casual end-user understands that his or her personal information is constantly being collected, to some degree, by data centers around the world. The point of unauthorized intrusion occurs as soon as the end-user is no longer aware that this data collection is happening.
Unauthorized intrusion comes on several levels. At the highest level, an unauthorized intrusion of personal information occurs when the perpetrator directly accesses a computer, stealing as much or as little information as possible. This can be achieved in a variety of ways, from the perpetrator physically breaching personal data by sitting down and simply using the computer, to slipping onto a system through a lack of network security or other vulnerabilities. But most privacy breaches are far more subtle than that, and they occur much more often than Internet users would like to think.
Putting Salehnia's concept into a real-world example, a user who enters a credit card on Amazon.com typically understands that they are giving that merchant quite a bit of information about themselves, much the same as if that user were to walk into a Macy's and make a similar purchase. On the other end of the spectrum, a casual end-user who uses Google several times a day may come to find out three years later that every search they have ever made was logged, time-stamped, and potentially used to Google's own business benefit. The latter, of course, is an illustration of unauthorized intrusion on the Internet, and it is happening millions of times a day.

A Brief History of Contemporary Search Engines
End-users were not always as comfortable with the Internet as they are today. Shopping on Amazon.com, running a Google search, and online banking are second nature to most Americans, but back in the 1990s they were not. In a 1992 Equifax study, 79% of Americans noted that they were concerned about their personal privacy on the Internet. Moreover, 55% of Americans felt that privacy and the security of personal information would get worse in the new millennium (Salehnia, 2002). "A March 1999 Federal Trade Commission (FTC) survey of 361 Web sites revealed that 92.8% of the sites were collecting at least one type of identifying information," explains Salehnia. Examining this statement more closely, roughly nine out of every ten Web sites not only detected but stored a visitor's name, address, IP, or something else that is both unique and identifying. That was seven years ago. Technology has since become much more precise and invasive, under the rationalization that some of these intrusions are necessary for convenience and the advancement of technology.
It was in the mid-1990s that several of the world's best-known search engines began appearing online. Yahoo!, debuting in 1995, was one of the first big search engines. It grew out of a simple homepage on which Stanford University students David Filo and Jerry Yang listed their own favorite Web sites; next to each URL they included the site's full name and a short description. As they began manually indexing more sites, Yahoo! took off, gaining a large fan base, which in turn led to major funding. Also making their debut in the mid-1990s were Excite, Lycos, and AltaVista, all three of which are still around today.
In 1996, two other Stanford University graduate students, Larry Page and Sergey Brin, launched BackRub, a system that organized Web sites using a proprietary algorithm designed by Page and Brin. The algorithm measured how many Web sites link to a given Web site: the more backlinks a page attracts, the higher its quality, and the higher its quality, the higher the page appears in the SERPs. Page and Brin opened shop in a garage, renamed BackRub to "Googol" several months later, and ultimately ended up with "Google" thanks to a misspelling on their first check from a venture capitalist. Google is presently one of the most viewed and most used Web portals on the Internet, and it is growing in virtually every way every single day. It is the big brother of the World Wide Web, and any change it makes affects virtually every Web site in existence. Google grows by the day, and not only in terms of features and pages indexed: Google's reservoir of data from its users' searches is one of the largest collections of data in history.
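The backlink idea behind Page and Brin's algorithm can be sketched in a few lines of Python. The fragment below is a simplified, hypothetical illustration of that iterative scoring, not their production algorithm; the damping factor, iteration count, and tiny example graph are conventional illustration choices.

def pagerank(links, damping=0.85, iterations=50):
    """Scores pages so that a backlink from a well-ranked page counts for more
    than one from an obscure page. `links` maps each page to the pages it links to.
    Simplified illustration; not Google's production algorithm."""
    pages = list(links)
    rank = {page: 1.0 / len(pages) for page in pages}
    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / len(pages) for page in pages}
        for page, outlinks in links.items():
            targets = outlinks or pages            # a dead-end page spreads its rank evenly
            share = damping * rank[page] / len(targets)
            for target in targets:
                new_rank[target] += share
        rank = new_rank
    return rank

# Page "C" is linked to by both "A" and "B", so it ends up with the highest rank:
print(pagerank({"A": ["C"], "B": ["C"], "C": ["A"]}))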

Survival of the Fittest
Companies, and Web sites in general, need search engines to survive. Search engines cannot be ignored or avoided. Professional Web sites are constructed to be search engine optimized (SEO), thereby improving their chances of a timely and deep indexing by the popular search engines. Typical search engine optimization trends are defined by the major search engines themselves. Google, for instance, has largely done away with scanning HTML meta tags, which used to define a Web site's keywords and details right in the HTML code, in favor of scanning a given Web site for concise keywords, themes, design style, inbound and outbound links, and even a visible privacy policy. In the age of information, when Google hiccups, every major Web developer hears it.
Search engines bring targeted traffic, which in turn converts to sales, which creates a client base. Many Web sites would flounder in the darkness of cyberspace without search engines. Lawrence M. Hinman (2006), director of the Values Institute at the University of San Diego, explains the philosophy of the search engine best: "Search engines have become, in effect, the gatekeepers of knowledge. If you don't show up in Google, you don't exist; indeed, if you don't show up on the first page of Google, you hardly exist." Because of this reliance on search engines, on both personal and business levels, Internet users will never stray away from using them. Alexa.com, one of the Web's most respected traffic pattern analysis companies, reports that the average Internet user visits google.com seven times a day. According to Alexa, google.com receives 75-100 million hits a day, not counting visits to its foreign mirrors or other services. Search engines are here to stay, and a level of vigilance must be maintained as to just how big they become and how they utilize the sheer amounts of data they collect.

What's the Harm?
What is the harm in having Google log 150 million searches a day? In her CNET news article "Google Balances Privacy, Reach" (2005), Elinor Mills explains some of the extreme, but plausible, scenarios that could arise: "The fear, of course, is that hackers, zealous government investigators, or even a Google insider who falls short of the company's ethics standards could abuse [this] information. Google, some worry, is amassing a tempting record of personal information, and the onus is on the Mountain View, California, company to keep that information under wraps." Furthermore, there is no way for users to delete their data from Google's records, nor is there any clear explanation of what the information could potentially be used for in the future.
According to GoogleWatch, a non-profit organization that has as one of its goals the spreading of awareness of search engine privacy, there are several important points regarding the potential privacy exploits and other harm Google could wreak. The first point it raises is Google's mysterious 30-year cookie. Google places a cookie with an expiration date of January 17, 2038, 2:14:05 PM on the hard drive of any user who makes a search or visits a Web site carrying AdSense advertisements (Google's publisher network). The cookie contains a uniquely identifying serial number that can be used to correlate searches with one another. In essence, it is another IP address: a second layer of identification Google can use to string together a user's search queries or site viewing trends.
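For illustration, the Python fragment below builds the kind of long-lived, uniquely identifying cookie GoogleWatch describes. The cookie name, domain, and serial-number format are hypothetical, not the contents of Google's actual cookie; only the far-future expiry echoes the date cited above.

import uuid
from http.cookies import SimpleCookie

def build_tracking_cookie():
    """Returns a Set-Cookie header carrying a unique serial number and a far-future
    expiry. The cookie name, domain, and value format here are hypothetical."""
    cookie = SimpleCookie()
    cookie["uid"] = uuid.uuid4().hex                            # uniquely identifying serial number
    cookie["uid"]["domain"] = ".example.com"                    # returned on every visit to the domain
    cookie["uid"]["path"] = "/"
    cookie["uid"]["expires"] = "Sun, 17 Jan 2038 14:14:05 GMT"  # decades in the future
    return cookie.output(header="Set-Cookie:")

# Every later request from this browser carries the same serial number, letting the
# server tie separate searches together even when the user's IP address changes.
print(build_tracking_cookie())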
The second breach of privacy, GoogleWatch argues, is the increasing reliance on IP addresses. As shown in the server log excerpt later in this chapter, Google includes a user's IP address, among other things, in each stored search query. An IP address reveals more than a number; it reveals a location as well. This is called IP delivery based on geolocation, and it can serve several purposes. For one, when a user searches Google, the engine knows the user's country, state, and town, and advertisements are then tailored to these settings.
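A toy sketch of such IP-based geolocation and ad tailoring is shown below. The prefix table and ad inventory are invented for the example; real services rely on large commercial IP-to-location databases.

GEO_TABLE = {
    "123.45.": ("US", "Michigan", "Kalamazoo"),
    "198.51.": ("US", "California", "Mountain View"),
}
REGIONAL_ADS = {
    "Kalamazoo": ["Kalamazoo pizza delivery", "West Michigan car dealers"],
    "Mountain View": ["Bay Area apartments", "Silicon Valley job fairs"],
}

def ads_for_ip(ip_address):
    """Maps a requester's IP address to a town, then returns ads for that town."""
    for prefix, (country, state, town) in GEO_TABLE.items():
        if ip_address.startswith(prefix):
            return REGIONAL_ADS.get(town, [])
    return []  # unknown location: fall back to untargeted ads

print(ads_for_ip("123.45.67.89"))  # ads tailored to the searcher's town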
Finally, GoogleWatch alludes to some of Google's side projects. A notable one is the Google Toolbar, released in March 2002, which continues to stir a great deal of controversy over what Google captures and logs. Google captures a user's IP address and sends a cookie when its search engine is used or when the user lands on a page with AdSense ads. The Google Toolbar, on the other hand, logs, records, and beams back to Google every site a user visits, regardless of whether that site is in any way affiliated with Google.
The Google Toolbar is just another example of modern conveniences: the convenience of having the most powerful search tool in the world in front of the end-user at all times outweighs the possible privacy invasions it may pose. According to PC Magazine writer Cade Metz, "One of the advanced Toolbar features is a service called PageRank. With this activated, when you visit a Web site, a small PageRank icon in the toolbar gives you a rough indication of how popular the site is. If the site is particularly popular, you'll see a long green line. If not, you'll see a short green line. When this service is turned on, Google keeps a complete record of every Web site you visit." The Google Toolbar ships with this page-tracking feature turned on. The Google Toolbar privacy policy explains that by knowing which Web page you are viewing, the PageRank feature of the Google Toolbar can show you Google's ranking of that Web page. PageRank is part of the patented search algorithm developed by Sergey Brin and Larry Page that decides which pages appear higher than others in the SERPs. Many users simply leave the feature turned on, while those in Web production or similar fields choose to leave it on as a convenient resource: it allows Web developers and Web aficionados to see in real time which Web sites have high PageRank and which do not. It is an invaluable resource to those in the field. The price paid, of course, is privacy. As a side note, out of the billions of Web pages out there, not many earn the coveted PageRank 10. A few examples are Google.com, Adobe.com, NASA.gov, Real.com, MIT.edu, and NSF.gov (November 2006).
Another clash of convenience and privacy within the Google realm comes with the company's mapping applications. Google's popular driving directions and maps Web site launched one of the company's most controversial and privacy-threatening features to date in June of 2007. Dubbed Street View, the feature allows end-users to view 360-degree, zoomable panoramic images from street level. While the images are not live, they still capture people in action at a certain point in time, and some of the images are clear enough to easily identify individuals. Google was bombarded with complaints by privacy advocates, and it has since instituted a policy of blacking out images of people upon request. The Street View feature took loss of privacy on the World Wide Web to a new, much more tangible level.

Search Engine Logging
Google certainly was not the first search engine, but it is the one that has made the largest impact on the Internet. Any change made by Google is felt not only by the Webmasters of the Internet but by virtually the entire World Wide Web; Google's reach is far and strong. Google engineers are constantly improving indexing technologies to make search results more accurate. This in turn brings in more users, which brings more traffic, which means more data to be collected and utilized. More traffic, of course, also means a higher flow of clicks on Google's advertising network, and the more clicks there are, the more likely it is that an advertiser's products are sold to users via AdWords.
Google tracks it all: every search, every query, right down to each and every letter and punctuation mark. Google stores upwards of 150 million queries a day from all over the world. What exactly does this data look like? In its most primitive form, a search for "colleges" would yield a piece of data along the following lines in Google's logs:

...89 - /Nov/00 0::9 - http://www.google.com/search?q=colleges - Firefox .0.; Windows XP SP 08kd0e9

Marked in this simple string of information are the end-user's Internet protocol address, the date and time of access, the complete URL with keywords (search?q=colleges), the browser and browser version, the operating system, and a unique user-identifying cookie. Google generates and stores over one billion of these strings a week, which equals approximately fifty billion a year. If Google has not deleted any of these queries since it began logging, then one could estimate that it currently stores upwards of half a trillion of these strings on its servers.
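As a sketch of how such a record could be broken back into its parts, the Python fragment below parses a log line of the general shape described above. The sample values (IP address, timestamp, browser string, and cookie identifier) and the field separator are invented for the illustration; Google's actual log format is not public.

from urllib.parse import urlparse, parse_qs

def parse_log_line(line):
    """Splits one logged request into the fields the chapter lists."""
    ip, when, url, browser, os_name, cookie_id = [f.strip() for f in line.split(" - ")]
    query = parse_qs(urlparse(url).query).get("q", [""])[0]
    return {"ip": ip, "when": when, "query": query,
            "browser": browser, "os": os_name, "cookie": cookie_id}

sample = ("123.45.67.89 - 25/Nov/2006 10:03:29 - "
          "http://www.google.com/search?q=colleges - Firefox 2.0.0.1 - "
          "Windows XP SP2 - 740674ce2123e969")
print(parse_log_line(sample)["query"])  # -> "colleges"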
Google provides a rather by-the-book explanation of why it logs every search. In a CNET news article, Elinor Mills asked Google executive Nicole Wong about the need for such a collection of data. The answer: Google uses the log information to analyze traffic in order to prevent people from rigging search results, to block denial-of-service attacks, and to improve search services. This data is then stored on many servers, for an unknown period of time. According to David F. Carr of Baseline Magazine, Google is believed to have anywhere from 150,000 to 450,000 servers around the world, with the majority in the United States.
Google's privacy policy, which outlines all of its data collection practices, is a mere two clicks away from any site on its entire network. The search engine policy is divided into nine clear, simple, easy-to-understand sections that span no more than two screens and clock in at approximately 980 words. The policy makes it clear that no personally identifiable information is ever released without a court warrant. However, anything typed into a search box is certainly fair game and very much becomes the property of Google.
Google's privacy policy clearly states that Google "may share information about you with advertisers, business partners, sponsors, and other third parties." According to its own policy, Google shares statistical data regarding search trends with groups of advertisers and affiliates. With information like this, an advertiser could potentially learn how best to optimize its own Web sites to increase traffic from Google or to appear significantly higher in the SERPs. Obviously, the more traffic a given site receives, the greater the probability of a conversion, sale, or lead.

Google Knows It All
Users type in what they need to find. Typically, this has much to do with what they are thinking, what they plan on doing, or something that is simply a part of their life. For instance, Google may log IP 123.45.67.89 as having 17 out of 20 searches on a given day that have to do with high blood pressure and its treatments. Clearly, Google now knows that IP 123.45.67.89 belongs to a user who either has this ailment or is researching it on behalf of a friend or relative who does. Kevin Bankston, staff attorney for the Electronic Frontier Foundation, one of the Internet's biggest privacy advocates, explains that data like this is practically a printout of what's going on in your brain: what you are thinking of buying, who you talk to, what you talk about.
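The profiling step itself is trivial once the queries are stored. The following Python sketch counts topic keywords in one address's logged queries for a day; the topic list and the sample data (built around the hypothetical IP above) are invented purely for illustration.

from collections import Counter

def dominant_interest(logged_queries, ip, topics=("blood pressure", "mortgage", "ice cream")):
    """Counts topic keywords in one IP address's logged queries."""
    counts = Counter()
    for logged_ip, query in logged_queries:
        if logged_ip == ip:
            for topic in topics:
                if topic in query.lower():
                    counts[topic] += 1
    return counts.most_common(1)

day_of_searches = ([("123.45.67.89", "high blood pressure diet")] * 17
                   + [("123.45.67.89", "cheap flights to Chicago")] * 3)
print(dominant_interest(day_of_searches, "123.45.67.89"))
# -> [('blood pressure', 17)]: 17 of 20 searches point at a single health concern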
Taking the high blood pressure example one step further, Google is now positioned to make money from user 123.45.67.89. First, Google will make money through its AdWords program: Google uses the search term to show advertisements as close to the search query as possible at the top and on the right side of each SERP. While this is by no means a direct breach of privacy, the notion of scanning what an end-user enters in order to deliver targeted content is beginning to evolve. Take Gmail, for instance. Gmail does not scan a search query; it scans private e-mail messages. The process is automated, of course. That is to say, there is not a Google employee sitting in a cubicle somewhere reading the personal e-mails of Gmail users; messages are scanned by a bot. Gmail took content-targeted, personalized ad delivery to a whole new level, a level that is simply the precursor to even more invasive advertising yet to come.

AdWords
The sheer amount of data Google has amassed is priceless. It is data that can prove trends, display top-ranking search queries, and illustrate which regions search for which items more; the list is long, but not as long as the list of marketing groups and advertisers that would do anything to have such data. Should the data Google has amassed ever be released, the damage could be devastating. In the summer of 2006, AOL released the search queries of 528,000 users of its generic search engine. While not nearly as detrimental as it could have been had Google released its data, the damage was already done. People were openly identified by their search terms; several were even convicted of crimes and sentenced to prison, as their search queries either supported evidence of their wrongdoing or put them in the spotlight and exposed them for committing a crime.
Google uses much of its data to sculpt its pride and joy: AdWords. AdWords accounts for nearly $3.5 billion in yearly revenue and is the main stream of Google's income. Google utilizes its users' search queries to aggressively and competitively price keywords. Advertisers then bid on these keywords for positions at the top and on the right side of the SERPs. The more relevant the ad, the more likely a targeted searcher is to click on it. While this is hardly an invasion of end-user privacy, the concept of AdWords utilizing search trends and scanning text supplied by the end-user is the precursor to more invasive, privacy-exploiting advertising practices.

Gmail: AdWords Evolve
Gmail, Google's free Web-based e-mail service, gives users who sign up their own Google e-mail address and nearly three gigabytes (and counting) of storage. It has a lightning-fast interface and some of the most advanced features ever seen in Web-based e-mail. But all of the flashy features are there for a reason: to once again draw people in. Once more, Google is freely handing out convenience for the small price of highly personalized, content-targeted advertising. This time around, Google is scanning e-mail messages for certain keywords. For instance, if Google sees the words "ice cream" in a private e-mail, advertisements for ice cream will appear on the right side of the e-mail application. According to the Gmail privacy policy, "Google maintains and processes your Gmail account and its contents to provide the Gmail service to you and to improve our services. The Gmail service includes relevant advertising and related links based on the IP address, content of messages and other information related to your use of Gmail."
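In its simplest form, this kind of content-targeted matching amounts to scanning the message body for keywords tied to an ad inventory, as in the Python sketch below. The keyword-to-ad table is invented for the example; Google has not published Gmail's actual matching logic.

AD_INVENTORY = {
    "ice cream": "Two-for-one sundaes at a parlor near you",
    "mortgage": "Refinance today and lower your monthly payment",
    "birthday": "Custom birthday cakes, delivered",
}

def ads_for_message(message_body):
    """Returns the ads whose keyword appears anywhere in the message text."""
    text = message_body.lower()
    return [ad for keyword, ad in AD_INVENTORY.items() if keyword in text]

email_text = "Want to grab ice cream after work and plan the birthday party?"
print(ads_for_message(email_text))  # matches the "ice cream" and "birthday" ads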
Elinor Mills of CNET explains another unsettling fact about Gmail: Google takes its time erasing trashed messages from its servers. According to Mills, Gmail users can delete messages, but the process is not intuitive; deletion takes multiple steps to accomplish, and it takes an undetermined period of time to remove the messages from all the Google servers that may hold a copy.
Since Gmail launched in 2004, its automated scanning of private e-mail messages has stirred a fair bit of controversy. The GoogleWatch organization raises the possibility of Gmail compiling a database of which e-mail addresses use which keywords. What further complicates Gmail's e-mail privacy is that people who do not even subscribe to the service have their e-mails scanned as well. If a Gmail user receives an e-mail from a Comcast e-mail address, the person with the Comcast address has their message scanned for advertising opportunities by Google. The sender never agreed to Google's invasive scanning, nor did they agree to have their e-mail infiltrated with ads pertaining to what they wrote in private. Gmail furthers the notion of invasive, targeted advertising and builds part of the framework for the forms of advertising Google may be capable of in the future.

The Future of the Search Engine Industry
Predicting the future trends of search engines is like predicting the future trends of any other industry: it is purely hypothetical. Search engines are businesses, and from time to time, like any other business, they drop hints of their future plans. Newspapers carry the latest headlines, and trade journals chart their latest strategic business moves. The best we can do is connect the dots using this information. Acquisitions, however, may give the best glimpse into the future of a particular search engine.
Take, for instance, Google's acquisition of YouTube. When Google acquired YouTube in October of 2006 for $1.65 billion, it sent ripples through the industry. Google, worth almost $170 billion in stock, swallowed YouTube whole. After all, in the grand scheme of things, YouTube is to media what Google is to information. Aside from being one of the largest technology acquisitions ever, the deal showcased Google's might. But above all, it dropped a hint of where Google is going.
From Google's press release: "The acquisition combines one of the largest and fastest growing online video entertainment communities with Google's expertise in organizing information and creating new models for advertising on the Internet. The combined companies will focus on providing a better, more comprehensive experience for users interested in uploading, watching and sharing videos, and will offer new opportunities for professional content owners to distribute their work to reach a vast new audience." (Google press release, October 9, 2006)
One of the main motivations behind Google's acquisition of YouTube is laid out right in that press release: creating new models for advertising on the Internet. The past few years have given birth to a new hybrid form of Internet-based advertising: viral marketing. Although viral marketing has been around for a decade, it was never a household term or a fully utilized marketing method until recently. Viral marketing encompasses any advertising effort that spreads quickly (typically over the Internet) through social networks and becomes the talk of the town. Viral videos and multimedia usually depict highly unique, comical, or risqué material with exceedingly unusual appeal to certain niche audiences. A successfully executed viral campaign costs nothing, yet it can create the same results as paid print or media advertising.
What makes the opportunity of viral marketing so enticing to Google? With millions of hosted videos, and upwards of 70,000 new videos added per day, YouTube controls a large portion of the Web's video-viewing audience. Combined with Google's analytical and marketing arms, the Google/YouTube combination makes for a serious advertising platform. If Google finds a popular viral niche, it can ride that niche by inserting its own advertising campaign, or that of one of its advertisers via AdWords. As viral videos spread like the plague through social networks, Google may be able to control what messages or advertisements those videos carry.
Viral marketing aside, imagine for a moment running a basic search for a particular car model. Instead of Google just returning the usual search results and accompanying AdWords, the search engine can now return streaming commercials targeted precisely to that make and model. Just as Google logs how many searches a particular user makes, it can also log which commercials play most often and use that data in a similar capacity. Above all, when end-users watch one of these advertisements they have actively sought it out, rather than passively watching a random commercial on TV. These Internet searchers are more targeted consumers than the passive couch potato.
Google's acquisition of YouTube is only one of dozens of major ventures either recently completed or in the works by the major search engines. Search engines are rapidly expanding their presence online as well as off-line. The future of search engines may be a rather ironic thing. During the dot-com bubble of the 1990s, businesses were doing everything possible to get online and to expand their presence there. Now we see well-established online entities wanting to move off-line. For the online powerhouses, the off-line world is an uncharted wilderness of opportunity and new audiences.
Take Google, which has paved the way for Internet advertising and is now researching something foreign to it but commonplace to traditional marketers: radio and television. Google AdWords makes it extremely simple for a marketing novice to launch an advertising campaign online. Through its simple interface, all one must do is create an ad, choose a target audience, and pay with a credit card. Google uses its stockpile of end-user search query data to run the advertisements as precisely as possible, which in turn brings the advertiser sales or conversions.

This degree of simplicity is something that never existed for marketers wishing to advertise on TV or radio. To advertise on one of those mediums, one traditionally must go through the complicated steps of calling the cable company, creating an advertisement with a third-party media company, and working with yet another third-party marketer to properly target the ads. If Google can take the complexity out of running advertisements on radio and television for the average person (especially in the age of Webcams and YouTube), it could potentially reshape the landscape of the TV and radio advertising industry.
Why does the future of search engines, especially if their main focus becomes the off-line world, matter? Simply put: because the data they begin collecting off-line will be migrated into the online world. If search engines can one day record the shows we watch, the songs we listen to, and the advertisements we actively (as opposed to passively) watch, there will be no stop in sight for the types of overly invasive advertising methods used.

The Future of Google Advertising
Today, the vast majority of our revenue is in text ads correlated with searches. In the last couple of years, we have developed what are called display ad products, including banner ads, video ads, click-to-call ads, and things like that. And I've also said that we are pursuing the possibility of television advertising. By that I mean traditional television advertising. And we bought dMarc Broadcasting to do radio ads.

So let's rank the probability of them being affected by targeted ads. There's search: That's 100 percent affected. What about radio? Is it possible to get a targeted ad right to your car right now? Not yet, because we can't target the individual receiver in your car. If two cars are next to each other, the same radio station cannot have two different ads. However, if it's at a regional level we can do it to the zip code level. So let's call that partial targeting.

Now, let's look at television. Every one of the next generation of cable set-top boxes is going to get upgraded to an IP-addressable set-top box. So all of a sudden, that set-top box is a computer that we can talk to. We can't tell whether it's the daughter or the son or the husband or the wife in a household. All we know is we're just talking to the television. But that's pretty targetable, because family buying patterns are pretty predictable, and you can see what programs they're watching. And if you're watching a YouTube video, we know you're watching that video.

My point of going through this little treatise is to say, if the total available market is $600 billion to $800 billion, we won't be able to target all $800 billion. It will not be 100 percent perfectly targetable, straight into your brain, but we should be able to offer a material improvement (in response rates) to many businesses. (Eric Schmidt, Google CEO, quoted in Fred Vogelstein, Wired Magazine, 4/9/2007)
AdWords is not only migrating into other services in Google's network; it is evolving technologically as well. During the summer of 2006, Google launched two new services that further enhanced AdWords. First, it launched its own payment processor, dubbed Google Checkout. Advertisers that elect to use Google Checkout have a small shopping cart icon next to their advertisement, bringing AdWords even closer to one-click purchasing. A Google search engine user who makes a purchase through Google Checkout surrenders their name, address, and credit card number to Google to complete that purchase. While the financial information is no doubt safe, Google now has the ability to connect a user's searches to their buying account, if they elect to, of course. Also launched in the summer of 2006 was AdWords Click-to-Talk: with the click of a button, a user may now talk to a salesman or representative on the other side of the advertisement.
A second advancement in Google's technology came with the 2005 launch of Google Desktop, which only began seeing stable releases in late 2006. Google Desktop took Google's indexing of files one step further. Like the Google Toolbar, this service is a downloadable software package. Essentially, Google Desktop indexes a user's e-mail, photos, music, and all other files so that they are easily searchable. Once more, Google drew a fan base for this product, as the search capability it engineered has proven far more powerful than the built-in Windows search. Google Desktop gives Google full access to a computer, and to other computers that system may be networked with, unless a particular setting in the program configuration is turned off.
In February of 2006, the Electronic Frontier Foundation distributed a press release expressing its outrage at Google's latest venture. The release stated: "Google announced a new feature of its Google Desktop software that greatly increases the risk to consumer privacy. If a consumer chooses to use it, the new Search Across Computers feature will store copies of the user's Word documents, PDFs, spreadsheets, and other text-based documents on Google's own servers, to enable searching from any one of the user's computers." The release of, and the technology driving, Google Desktop is yet another precursor to the potential future plans Google has in store for data collection and targeted, intrusive advertising.
As if Google's reach were not far and impressive enough, the next stop for Google might be the airwaves. Google has been known for some time to be toying with the idea of becoming a full-blown Internet service provider, in the form of a massive WiMax or Wi-Fi network. This would give it more leverage to deliver targeted advertising and appeal to an even larger base. The notion came about early in 2005, when the city of San Francisco sent out a request for information about a free city-wide high-speed wireless network. Google originally showed much interest.
PC World magazine describes Google's pitch to San Francisco as a system that would deliver both free access and the ability to upgrade to faster access. PC World writer Stephen Lawson explains: "Google proposed to build an IEEE 802.11b/g Wi-Fi mesh network that delivers more than 1 megabit per second of capacity throughout the city. Anyone in the city could get access free at speeds as high as 300 kilobits per second, and Google or third parties could sell access at higher speeds, possibly as high as 3 Mbps."
By entering the world of Wi-Fi and WiMax, Google secures itself a new stream of potential income while showing off its might to other media conglomerates. With a free WiMax network, Google can also get into more homes and places that otherwise might never have been able to afford Internet access. The key word, of course, is "free." Jeff Chester, writing for The Nation, explains that the costs of operating the free service would be offset by Google's plans to use the network to promote its interactive advertising services. In other words, Google generates even more free traffic, which in turn uses its services, generates more data, and makes Google more money. Google is currently up against other big providers, such as EarthLink, which also wants to develop a massive Wi-Fi network in San Francisco.
Along with the potential implementation of city-wide wireless networks comes the ability for mobile phones to use these networks instead of the traditional cellular networks. Imagine, for a moment, placing a phone call to a friend some years down the road. The mobile phone is any one of the typical brand-name phones, but it does not belong exclusively to any network like Verizon, Sprint, or Cingular. Instead, it uses voice over IP (VoIP) technology over a nationwide, ubiquitous Google Wi-Fi network. The phone call travels over the Internet, much the same as it can today. Presently, VoIP-based phones must be attached to a computer or used wirelessly within close proximity of an access point; the ability to roam freely does not exist. In the future, the phone in this example may be used at the beach, in a restaurant, or in a department store, much like an ordinary cell phone. With high-speed WiMax and city-wide Wi-Fi on the horizon, cell phones that rely on VoIP have the potential to become a popular commodity.
Now imagine a conversation taking place over this network. The conversation has to do with getting an ice-cream cake for a friend's upcoming birthday. If the Wi-Fi or VoIP network the phone is hooked into is run by Google, for instance, the company may have technology to scan spoken words in the same fashion as it scans typed words in Gmail. As soon as "ice cream" is said in the conversation, a small tone beeps, lighting up the phone's screen, and on the screen is the nearest location of an ice cream parlor. If Google can scan what users type in their personal e-mails, why not move into audible conversations? By offering such telecommunications services for free (or significantly cheaper than the cellular alternative), Google again increases its customer base and exposes more users to its network of advertisements. The sacrifice of privacy for convenience will evolve with the technology.

In Google We Trust
In a world where technology keeps creating conveniences, we must sometimes hit the brakes to make sure nothing is being missed. Google is amassing one of the largest collections of search data in history. It remains to be seen whose hands this data will end up in, or how Google will use it to its maximum advantage. Services like AdWords and Gmail are steadily pushing the notion of invasive, extremely targeted advertising, and future innovations are sure to bring even more invasive advertising. While the vast majority of casual Internet users either do not know Google's data collection policies or simply do not care, at the end of the day it comes down to the simple fact that we as a society must put our trust in the technological innovations that have become commonplace conveniences.
We live in an interesting time. Wires are disappearing, and high-speed Internet will soon be in the air all around us. Media conglomerates will go up against each other for control of the wireless Internet airwaves. Online giants will migrate off-line, and more off-line entities will move online. Privacy will always be a question, but smart surfing and awareness can help reduce unwanted intrusion. Stephen O'Grady, the blogger behind the Tecosystems blog, was quoted by PC Magazine writer Cade Metz: "Google is nearing a crossroads in determining its future path. They can take the Microsoft fork and face the same scrutiny Microsoft does, or they can learn what the folks from Redmond have: Trust is hard to earn, easy to lose and nearly impossible to win back."

References

Bankston, K. (2006, February 9). Press releases: February 2006. Electronic Frontier Foundation. Retrieved November 11, 2006, from http://www.eff.org/news/archives/2006_02.php

Brandt, D. (n.d.). Google as big brother. Retrieved November 11, 2006, from http://www.google-watch.org

Carr, D. (2006, July 6). How Google works. Retrieved November 17, 2006, from http://www.baselinemag.com/article2/0,1397,1985040,00.asp

Chester, J. (2006, March 26). Google's wi-fi privacy ploy. Retrieved November 14, 2006, from www.thenation.com/doc/20060410/chester

Gage, D. (2005, March 7). Shadowcrew: Web mobs timeline: Cybercrime. Retrieved November 1, 2006, from http://www.baselinemag.com/article2/0,1397,1774786,00.asp

Garfinkel, S. (2002). Web security, privacy & commerce (2nd ed.). Sebastopol, CA: O'Reilly & Associates.

Google corporate information: Google milestones. (n.d.). Retrieved November 25, 2006, from http://www.google.com/corporate/history.html

Google privacy center: Privacy policy. (n.d.). Retrieved November 10, 2006, from http://www.google.com/privacy.html

Google's advertising footprint. (2007, June 14). Retrieved July 21, 2007, from http://www.eweek.com/slideshow/0,1206,pg=0&s=26782&a=209549,00.asp

Hinman, L. (2006, March 16). Why Google matters. Retrieved November 7, 2006, from http://www.signonsandiego.com/uniontrib/20060316/news_lz1e16hinman.html

Internet usage world stats: Internet and population. (n.d.). Retrieved November 12, 2006, from http://www.Internetworldstats.com

Lawson, S. (2006, November 29). Google describes its wi-fi pitch. Retrieved December 1, 2006, from http://www.pcworld.com/article/id,123157-page,1/article.html

Metz, C. (2003, February 27). Is Google invading your privacy? Retrieved December 2, 2006, from http://www.pcmag.com/article2/0,4149,904096,00.asp

Mills, E. (2005, July 14). Google balances privacy, reach. CNET News.com. Retrieved November 7, 2006, from news.com.com/Google+balances+privacy,+reach/2100-1032_3-

Report to Congressional Requestors. (2005). Information security: Emerging cybersecurity issues threaten federal information systems. Retrieved December 12, 2006, from http://www.gao.gov/new.items/d05231.pdf

Salehnia, A. (2002). Ethical issues of information systems. Hershey, PA: Idea Group Incorporated.

Thompson, B. (2003, February 21). Is Google too powerful? Retrieved December 12, 2006, from http://news.bbc.co.uk/2/hi/technology/2786761.stm

Traffic details for Google.com. (n.d.). Retrieved November 11, 2006, from http://www.alexa.com/data/details/traffic_details?q=www.google.com&url=google.com

Vogelstein, F. (2007, April 9). Text of Wired's interview with Google CEO Eric Schmidt. Retrieved July 15, 2007, from http://www.wired.com/techbiz/people/news/2007/04/mag_schmidt_trans?curren

Additional Reading

Battelle, J. (2005). The search: How Google and its rivals rewrote the rules of business and transformed our culture. Portfolio Hardcover. ISBN 1-59184-088-0.

Brin, S., Motwani, R., Page, L., & Winograd, T. (1999). The PageRank citation ranking: Bringing order to the Web. Retrieved from http://dbpubs.stanford.edu:8090/pub/showDoc.Fulltext?lang=en&doc=1999-66&format=pdf&compression=

Brin, S., & Page, L. (1998). The anatomy of a large-scale hypertextual Web search engine. Retrieved from http://dbpubs.stanford.edu:8090/pub/1998-8

Electronic Frontier Foundation (EFF): http://www.eff.org/issues/privacy

Google PageRank Patent: http://patft.uspto.gov/netacgi/nph-Parser?patentnumber=7058628

Google Privacy Policy: http://www.google.com/privacy.html

Google-Watch.org: http://www.google-watch.org

Malseed, M., & Vise, D. (2005). The Google story. Delacorte Press.

Search Engine Land Blog: http://searchengineland.com/

Search Engine Watch: http://searchenginewatch.com/

World Privacy Forum: Search Engine Tips: http://www.worldprivacyforum.org/searchengineprivacytips.html





Chapter II

A Taxonomic View of Consumer Online Privacy Legal Issues, Legislation, and Litigation

Angelena M. Secor
Western Michigan University, USA

J. Michael Tarn
Western Michigan University, USA

Abstract

In this chapter, consumer online privacy legal issues are identified and discussed. Following a literature review of consumer online privacy legislation and litigation, a relational model is presented to explore the relationship between the issues, the legal protections, and the remedies and risks for not complying with the legal requirements. Two survey studies are used to reinforce the vital need for a stronger role by government and the business community, as well as for privacy awareness on the part of online consumers themselves. The chapter concludes with a vital call for consumer privacy education and awareness and for the attention of government and legislators, with timely legislation that protects consumers against those who would misuse the technology.



Introduction

Information privacy is defined as the right of individuals to control information about themselves (Richards, 2006). As the Internet becomes more popular and more people use it as a daily means of communication, information sharing, entertainment, and commerce, there are more opportunities for breaches of privacy and attacks with malicious intent. Numerous bills have been introduced in the House of Representatives and the Senate in recent years attempting to legislate protections for consumers regarding online privacy, yet many of these attempts fail to become law. This study aims to examine consumer online privacy legal issues, recent litigation topics, and the presently active legislation. The topic is of interest because some of this legislation does not provide more consumer protection but instead takes consumer privacy away, as with the USA Patriot Act and the Homeland Security Act enacted after the terrorist attacks of September 11, 2001. These laws give government more access to private information instead of providing consumers with increased protections.
Some relevant privacy issues are underage consumer protection, health information privacy, lack of consumer control over information stored in databases, information security breaches, and identity theft. Recent litigation in the United States in the information security area has centered on the lack of protection of the information that companies gather and store from consumers. The Federal Trade Commission (FTC) has initiated lawsuits against companies that do not provide the level of information protection they should. The FTC charged Petco with Web site security flaws that allowed a structured query language (SQL) injection attacker to obtain consumer credit card information (FTC File No. 032 3221, 2004). The FTC also charged BJ's Wholesale Club with failing to appropriately secure credit card magnetic stripe information (FTC v. BJ's Wholesale Club, Inc., 2005). A class action suit was also filed on behalf of Banknorth, N.A. (Visa and Mastercard) against BJ's Wholesale Club, alleging that hackers gained access to cardholders' credit card information and used it fraudulently (FTC v. BJ's Wholesale Club, Inc., 2005). These instances are examples of companies failing to take proper measures to secure consumer information. The stolen personal information could have been gathered through an online interaction or a personal visit to the company. These examples show that it does not matter how a consumer interacts with a company; whether on the Web, in person, or on the phone, the company stores the information it gathers in databases on its systems, and all of that information is a target.
Current laws relating to consumer online privacy and protection are the U.S. Safe Web Act of 2006, the USA Patriot Act of 2001, the Homeland Security Act and its Cyber Security Enhancement Act, the Federal Privacy Act of 1974, the Children's Online Privacy Protection Act, and the Health Insurance Portability and Accountability Act of 1996. Only the points pertaining to consumer privacy and protection are included here; not all parts of these laws are applicable to the subject of this chapter.
In the following section, consumer online privacy legal issues are identified and discussed. Following a literature review of consumer online privacy legislation and litigation, the authors present a relational model to explore the relationship between the issues, the legal protections, and the remedies and risks for not complying with the legal requirements. Two survey studies are then used to reinforce the vital need for a stronger role by government and the business community, as well as for privacy awareness on the part of online consumers themselves. The chapter concludes with a vital call for consumer privacy education and awareness and for the attention of government and legislators, with timely legislation that protects consumers against those who would misuse the technology.




Table 1. Research in consumer online privacy legal issues

Year | Author | Issue | Contribution

2006 | Swartz, Nikki | Health Information Privacy | A survey of 1,117 hospitals and health systems conducted in January by the American Health Information Management Association (AHIMA) found that compliance with the three-year-old federal rules governing the privacy of patients' medical records declined in the past year. The survey results also suggest that patients are becoming more concerned about the privacy of their medical records; according to 30% of respondents, more patients are asking questions.

2006 | Vasek, S. | Information Privacy | Sensitive, confidential information found in court and probate records, deeds, mortgages, death certificates, and other civic records is posted online. The information is available because the records containing it are, by law, public records filed according to provisions of state statutes and maintained at taxpayer expense. In many instances these records contain health information, social security and Medicare numbers, birth dates, bank account information, prescription numbers, and information that could be used to steal individuals' identities.

2004 | Milne et al. | Information Security Breaches | Directly hacking into company databases and stealing personal or financial data, such as consumer credit card or social security information.

2004 | Milne et al. | Identity Theft | Using techniques such as IP spoofing and page jacking; using e-mail and a bogus Web page to gain access to individuals' credit card data and steal thousands of dollars from consumers; cyber-thieves who were able to access tens of thousands of personal credit reports online.

2004 | Milne et al. | Spyware, Malware, Viruses & SPAM | The installation of spyware distributed as viruses attached to e-mail makes it possible for third parties to view the content of a consumer's hard drive and track movement through the Internet; cookies allow others to track clickstream history.

2003 | Bagner et al. | Underage Consumer Protection | The lack of supervision while online exacerbates a child's vulnerability to online violations of privacy. A 1998 survey revealed that 36% of parents admitted that they never supervised their child's use of or access to the Internet.

Consumer Online Privacy Legal Issues
Table 1 summarizes the major research studies in consumer online privacy issues, which fall into the following six categories: information security breaches, information privacy breaches, identity theft and pre-texting, health information privacy, underage consumer protection, and spyware, malware, viruses, cookies, and SPAM. The following subsections discuss these six categories.

Information Security Breaches
Information security breaches include events such as hacker or SQL injection attacks on business or institutional networks that result in stolen personal, financial, or medical information, interceptions of e-mail by unauthorized parties, and any breach of security not withstood by current security practices. Each of these security breaches can be disastrous for consumers. In the instance of FTC v. Petco, an unauthorized person outside the company was able to break into the Petco information system through its Web site using structured query language (SQL) injection attacks and gained access to customer personal and credit card information that was being stored on the system in an unencrypted format (FTC File No. 032 3221, 2004). This security breach resulted in the compromise of customer information in the custody of the business that was not reasonably and appropriately protected against external threats.
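The flaw at issue is easy to illustrate. In the hypothetical Python sketch below (the table and column names are invented), the first query concatenates user input directly into the SQL string, so crafted input rewrites the query and dumps every stored card number; the second, parameterized query treats the same input as plain data.

import sqlite3

def find_customer_unsafe(db, account_id):
    # VULNERABLE: the input is pasted into the SQL, so "1 OR 1=1" returns every row
    return db.execute(
        "SELECT name, card_number FROM customers WHERE id = " + account_id
    ).fetchall()

def find_customer_safe(db, account_id):
    # SAFE: the driver binds the value as data, so it can never rewrite the query
    return db.execute(
        "SELECT name, card_number FROM customers WHERE id = ?", (account_id,)
    ).fetchall()

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE customers (id INTEGER, name TEXT, card_number TEXT)")
db.executemany("INSERT INTO customers VALUES (?, ?, ?)",
               [(1, "Alice", "4111-1111"), (2, "Bob", "4222-2222")])
print(find_customer_unsafe(db, "1 OR 1=1"))  # leaks every stored card number
print(find_customer_safe(db, "1 OR 1=1"))    # returns nothing: the input matched no id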
Besides being threatened by external entities, information can be threatened by internal entities as well. Security breaches by employees or agents of companies retaining consumer information occur all too often. Although there are no good statistics to quantify the amount of loss, this remains a threat to information security. Businesses need to studiously monitor employee access to sensitive information and have enforceable privacy policies in effect. Policies should specify what information access is necessary for each job position, how information is handled and, if necessary, disposed of (shredded), and that access is removed immediately upon employee dismissal. Policies may also include minimum password security measures. Also included in information security breaches is the interception of e-mail or unencrypted files. According to AnonIC.org, e-mail can be intercepted at each step along the way: "(The) E-mail message is stored on two servers on its way at least: on sender ISP mail server and on recipient ISP mail server. When traveling through the MX hosts, message is stored on each of MX hosts" (Email Security and Anonymity, 2004). Depending on where the e-mail message is going, it could be stored on many servers along the way, and there is opportunity at many points for an experienced attacker to gain access to e-mail messages and the files attached to them.

Information Privacy Breaches
Information privacy breaches include any release of information by a business or institution that is not compliant with the privacy practices defined by law or within the business's privacy statement. Such occurrences may be purely accidental, but they highlight the need for business and institutional policies that pay careful attention to the laws and ensure that harm to consumers does not result from their actions. An example of an information privacy breach would be the selling of customer information to third-party vendors without notifying the consumer, through a privacy statement, that such a release of information could occur. In 2003 the FTC brought charges against CartManager, Inc., a Web-based shopping cart software company, for selling to third parties the personal information of nearly one million consumers. The FTC asserted that consumers reasonably expected that the merchants' privacy policies, rather than CartManager's, covered information consumers provided on the shopping cart and checkout pages of the merchants' Web sites (Ambrose & Gelb, 2006).

Identity Theft and Pre-Texting
Identity theft is when someone uses another's personally identifying information, such as names, social security numbers, or credit card numbers, without permission, to commit fraud or other crimes (ftc.gov, 2006). Online consumer privacy issues such as identity theft can occur through many channels. One example is phishing, where consumers receive unsolicited e-mails requesting information and unwittingly provide private information, which is then used maliciously. The theft can also occur when a company does not secure electronic information appropriately or when employees steal the information. Identity theft can also occur in an off-line situation where documents containing personal information are not disposed of properly, by either a business or the consumers themselves. Many businesses have policies regarding the disposal of confidential or sensitive information, specifying that the documents must be shredded, but there are many recent instances in which businesses have been caught not complying with their own policies, and documents containing personal information were not shredded.
Consumers themselves must take steps to
prevent identity theft (see suggestions on FTC.gov
Web site). In addition to common ways thieves
steal identities, there is an activity called pretexting. According to the FTC (FTC.gov, 2006),
pre-texting is the practice of getting your personal


information under false pretenses, which uses a


variety of tactics to obtain personal information.
For example, a pre-texter may call, claim he is
from a research firm, and ask you for your name,
address, birth date, and social security number.
When the pre-texter has the information he wants,
he uses it to call your financial institution. He
pretends to be you or someone with authorized
access to your account. He might claim that he
has forgotten his checkbook and needs information about his account. In this way, the pre-texter
may be able to obtain other personal information
about you such as your bank and credit card account numbers, information in your credit report,
and the existence and size of your savings and
investment portfolios (FTC.gov, 2006).

Health Information Privacy


Health information privacy is of great concern to many consumers. HIPAA, the Health Insurance Portability and Accountability Act of 1996, requires healthcare providers to adopt standards designed to facilitate the development of a uniform, computer-based healthcare information system, while protecting the security of the information and the privacy of its subjects (Michael & Pritchett, 2001). The concern comes from having personal health information available in a computer-based healthcare information system. If the information is housed on a computer network, then there are concerns over the security of the information. The legislation demands security of the information, but even with great security, information can still be misused, stored improperly, accessed inappropriately, and disposed of incorrectly, as with any electronic information.
According to OCR (2005), the Privacy Rule protects all individually identifiable health information held or transmitted by a covered entity or its business associate, in any form or media, whether electronic, paper, or oral. The Privacy Rule calls this information protected health information (PHI). HIPAA also provides


provisions of the law: standards for electronic transactions and standards for the privacy of individually identifiable health information (Michael & Pritchett, 2001). The HIPAA legislation demands that providers use electronic transactions in addition to computer-based information systems, which essentially covers all facets of healthcare interactions within their organizations and with external entities. In addition, HIPAA regulates the use and disclosure of personally identifiable information and demands that patients be given a privacy statement detailing how the provider or institution will use their information.

Underage Consumer Protection


The growth of computer use in school systems means that children are educated in the use of computers and the Internet. Children therefore know how to browse the Internet and interact with online entities. In addition, many children have access to computers at home and can use the Internet from there as well. School computers have content-blocking software installed so that the school can filter the content returned for online searches and block access to Web sites known to contain inappropriate content, for the safety of the children, but the same content-filtering software may not be installed on every computer accessible to children. In addition to access concerns, there are concerns over how the online entities with which children interact behave. According to the FTC (FTC.gov, 2007), Congress enacted COPPA (the Children's Online Privacy Protection Act) in 1998 to address privacy and security risks created when children under 13 years of age are online. COPPA imposes requirements on operators of Web sites and online services directed to children, as well as other operators with actual knowledge that they have collected personal information from children.
The primary goal of COPPA and the Rule is to place parents in control over what information is collected from their young children online. The


Rule was designed to protect children under age


13 while accounting for the dynamic nature of
the Internet. The Rule applies to operators of
commercial Web sites and online services directed
to children under 13 that collect, use, or disclose
personal information from children, and operators
of general audience Web sites or online services
with actual knowledge that they are collecting,
using, or disclosing personal information from
children under 13 (FTC.gov, 2007).
In addition, the Rule prohibits operators from conditioning a child's participation in an online activity on the child's providing more information than is reasonably necessary to participate in that activity (FTC.gov, 2007).

Spyware, Malware, Viruses, Cookies, Phishing, and Spam
Spyware is software installed on a computer without the owner's consent, which monitors or controls the owner's computer use to send pop-up ads, redirect the computer to Web sites, monitor Internet surfing, or record keystrokes, which, in turn, could lead to identity theft (FTC.gov Spyware, 2007). There are several ways for a computer to get infected with spyware. Some of the most frequent are downloading free software that contains spyware; unauthorized downloads that occur without the consumer's knowledge when Internet security is set to less than medium; clicking links within pop-up ads that contain spyware; and opening a spam e-mail or attachment that downloads the spyware (FTC.gov Spyware, 2007).
Malware includes adware, hijackers, toolbars,
and dialers. Baratz and McLaughlin (2004) define
malware as follows:

Adware is the class of programs that place advertisements on a computer screen. These may be in the form of pop-ups, pop-unders, advertisements embedded in programs, advertisements placed on top of ads in Web sites, or any other way the authors can think of showing the victim an ad. The pop-ups generally will not be stopped by pop-up stoppers, and often are not dependent on the victim having Internet Explorer open. They may show up when the victim is playing a game, writing a document, listening to music, or anything else. Should the victim be surfing, the advertisements will often be related to the Web page he or she is viewing.

Hijackers take control of various parts of victims' Web browsers, including their home page, search pages, and search bar. A hijacker may also redirect a victim to certain sites should the victim mistype an address, or prevent him or her from going to a Web site the hijacker would rather the victim not visit, such as sites that combat malware. Some will even redirect a victim to a hijacker's own search engine when the victim attempts a search.

Toolbars plug into Internet Explorer and provide additional functionality such as search forms or pop-up blockers. The Google and Yahoo! toolbars are probably the most common legitimate examples, and malware toolbars often attempt to emulate their functionality and look. Malware toolbars almost always include characteristics of the other malware categories. Any toolbar that is installed through underhanded means falls into the category of malware.

Dialers are programs that set up a victim's modem connection to connect to a 1-900 number. This provides the number's owner with revenue while leaving the victim with a large phone bill. Most dialers are installed quietly and attempt to do their dirty work without being detected.

As defined by the National Consumers League


(NCL), phishing is using the Internet to fraudulently gather personal data about a consumer
(Fraud.org). According to a report published by




the NCL, the bait is an e-mail claiming to be from a trusted organization, such as a bank or an online retailer. The e-mail often claims that the consumer must urgently take action, or else something bad will occur, such as closure of the account. Once the phisher "sends spam with bait," the e-mail provider "delivers bait to consumer." Next, the "user reads bait." A user might respond directly to the e-mail ("user enters info"). More often, the user "clicks on spoofed link." The link typically leads to a Web site controlled by the phisher, designed to look like the site of the trusted company. The consumer then enters personal information, such as an account number, password, or social security number. When the "user enters info on spoofed site," the phishing attack has succeeded at its first goal: to gather personal information fraudulently. Next, the personal information is used to harm the consumer, when the "bad guy selects victims and attempts fraud." Important examples of fraud are when the phisher commits bank fraud, such as by hijacking the consumer's account, or credit card fraud, by using the personal information to purchase goods fraudulently (Fraud.org).
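As a small illustration of the "clicks on spoofed link" step (a sketch of one common heuristic, not something drawn from the NCL report), the following Python check compares the domain a link displays with the domain it actually points to; a mismatch is a frequent, though not conclusive, sign of phishing.

from urllib.parse import urlparse

def looks_spoofed(display_text: str, actual_href: str) -> bool:
    # Compare the host shown in the link text with the host the link
    # really targets; phishing links often display one and point to another.
    shown = urlparse(display_text if "://" in display_text else "http://" + display_text).hostname
    real = urlparse(actual_href).hostname
    return bool(shown and real and shown.lower() != real.lower())

# Link text names the bank, but the href points to the phisher's server.
print(looks_spoofed("www.examplebank.com", "http://login.phisher.example/verify"))  # True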
A cookie works as follows: when a consumer visits a site, a notation may be fed to a file known as a cookie on his or her computer for future reference. If the person revisits the site, the cookie file allows the Web site to identify him or her as a return guest and offer products tailored to his or her interests or tastes. The consumer can set his or her online preferences to limit cookies or to be notified when a Web site places a cookie on his or her computer (FTC.gov Spyware, 2007). Marketers want to customize a consumer's experience at their Internet store, so they collect personal information from the consumer's computer cookies when the consumer enters their Web space. They use this information to target their marketing specifically to the individual. The companies collect this information, store it in their databases, and can even sell the information to other companies.
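A brief sketch of the mechanism just described, using Python's standard http.cookies module (an illustration only; the chapter does not prescribe any particular implementation):

from http import cookies

# Server side: issue a cookie so the site can recognize the visitor as a
# return guest on later visits.
outgoing = cookies.SimpleCookie()
outgoing["visitor_id"] = "abc123"                      # identifier chosen by the site
outgoing["visitor_id"]["path"] = "/"
outgoing["visitor_id"]["max-age"] = 60 * 60 * 24 * 30  # keep for 30 days
print(outgoing.output())          # the Set-Cookie header sent to the browser

# On the next visit the browser echoes the value back in a Cookie header,
# which the site parses to look up the returning visitor's stored preferences.
incoming = cookies.SimpleCookie("visitor_id=abc123")
print(incoming["visitor_id"].value)                    # abc123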



Spam is the receipt of unsolicited e-mail. The


CAN-SPAM Act of 2003 (Controlling the Assault
of Non-Solicited Pornography and Marketing Act)
establishes requirements for those who send commercial e-mail, spells out penalties for spammers
and companies whose products are advertised in
spam if they violate the law, and gives consumers
the right to ask e-mailers to stop spamming them.
The law, which became effective January 1, 2004,
covers e-mail whose primary purpose is advertising or promoting a commercial product or service,
including content on a Web site. A transactional or relationship message (e-mail that facilitates an agreed-upon transaction or updates a customer in an existing business relationship) may not contain false or misleading routing information, but otherwise is exempt from most provisions of the CAN-SPAM Act (FTC.gov, 2004).

Consumer Online Privacy Issues, Legislation, Litigation & Their Inter-Relationships
Having examined the six legal categories of consumer online privacy, we now investigate solutions to these issues in terms of legislation and litigation. The results of the literature review are summarized in Table 2 and Table 3.
Based on the literature research on the legal issues, legislation, and litigation of consumer online privacy, the relational model of consumer online privacy shown in Figure 2 was developed. It exhibits the relationship between the issues consumers have with online privacy, how those issues are addressed by the protections provided by legislation, and the remedies for consumers and risks to companies for not complying with the legal requirements. Some of the legislative works cover more than one consumer online privacy category. For instance, the Federal Privacy Act and the Fair and Accurate Credit Transactions Act could be used to


Table 2. Consumer online privacy legislation

Year: 2006. Law: U.S. Safe Web Act.
Applications: Undertaking Spam, Spyware, And Fraud Enforcement With Enforcers Beyond Borders Act of 2005, or the U.S. SAFE WEB Act of 2005.
Contribution (includes but not limited to): Amends the Federal Trade Commission Act to include within the definition of unfair or deceptive acts or practices those acts or practices involving foreign commerce. Authorizes the FTC to disclose certain privileged or confidential information to foreign law enforcement agencies.

Year: 2003. Law: Fair and Accurate Credit Transactions Act.
Applications: To amend the Fair Credit Reporting Act to prevent identity theft, improve resolution of consumer disputes, improve the accuracy of consumer records, and make improvements in the use of, and consumer access to, credit information.
Contribution (includes but not limited to): Identity theft prevention. Protection and restoration of identity theft victims' credit history. Improvements in use of and consumer access to credit information. Enhancing the accuracy of credit information. Limiting the use and sharing of medical information in the financial system. Financial literacy and education improvement funds.

Year: 2003. Law: CAN-SPAM Act.
Applications: Controlling the Assault of Non-Solicited Pornography and Marketing Act.
Contribution (includes but not limited to): Bans false or misleading header information. Prohibits deceptive subject lines. Requires that the e-mail provide an opt-out method. Requires that commercial e-mail be identified as an advertisement and include the sender's valid physical mailing address.

Year: 2002. Law: Federal Information Security Management Act.
Applications: Bolsters computer and network security within the Federal Government and affiliated parties (such as government contractors) by mandating yearly audits.
Contribution (includes but not limited to): FISMA imposes a mandatory set of processes that must be followed for all information systems used or operated by a U.S. federal government agency or by a contractor or other organization on behalf of a U.S. government agency. These processes must follow a combination of Federal Information Processing Standards (FIPS) documents and the Special Publication 800 (SP-800) series issued by NIST.

Year: 2001. Law: USA Patriot Act.
Applications: Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism.
Contribution (includes but not limited to): Expands the scope of subpoenas for records of electronic communications to include the length and types of service utilized, temporarily assigned network addresses, and the means and source of payment (including any credit card or bank account number). Permits electronic communication and remote computing service providers to make emergency disclosures to a governmental entity of customer electronic communications to protect life and limb. Makes it lawful to intercept the wire or electronic communication of a computer trespasser in certain circumstances. Provides for nationwide service of search warrants for electronic evidence.

Year: 2001. Law: Homeland Security Act, Cyber Security Enhancement Act.
Applications: Enhanced cyber security and protections for companies assisting in investigations.
Contribution (includes but not limited to): Directs the Attorney General to establish and maintain a National Infrastructure Protection Center to serve as a national focal point for threat assessment, warning, investigation, and response to attacks on the Nation's critical infrastructure, both physical and cyber. Prohibits the distribution of advertisements of illegal interception devices through the Internet as well as by other, specified media. Broadens the offense of, and increases the penalties for, illegally intercepting cell-phone conversations or invading the privacy of another person's stored communications. States that a law enforcement officer need not be present for a warrant to be served or executed under the Electronic Communications Privacy Act.

Year: 2000. Law: Children's Online Privacy Protection Act.
Applications: Specifically protects the privacy of children under the age of 13 by requesting parental consent for the collection or use of any personal information of the users.
Contribution (includes but not limited to): Incorporation of a detailed privacy policy that describes the information collected from its users. Acquisition of verifiable parental consent prior to collection of personal information from a child under the age of 13. Disclosure to parents of any information collected on their children by the Web site. A right to revoke consent and have information deleted. Limited collection of personal information when a child participates in online games and contests. A general requirement to protect the confidentiality, security, and integrity of any personal information that is collected online from children.

Year: 1996. Law: Health Insurance Portability and Accountability Act.
Applications: Establishes, for the first time, a set of national standards for the protection of certain health information.
Contribution (includes but not limited to): The Privacy Rule protects all individually identifiable health information held or transmitted by a covered entity or its business associate, in any form or media, whether electronic, paper, or oral. The Privacy Rule calls this information protected health information (PHI). Individually identifiable health information is information, including demographic data, that relates to the individual's past, present, or future physical or mental health or condition; the provision of health care to the individual; or the past, present, or future payment for the provision of health care to the individual, and that identifies the individual or for which there is a reasonable basis to believe it can be used to identify the individual. Individually identifiable health information includes many common identifiers (e.g., name, address, birth date, social security number).

Year: 1974. Law: Federal Privacy Act.
Applications: Created in response to concerns about how the creation and use of computerized databases might impact individuals' privacy rights.
Contribution (includes but not limited to): Safeguards privacy through creating four procedural and substantive rights in personal data. It requires government agencies to show an individual any records kept on him or her. It requires agencies to follow certain principles, called fair information practices, when gathering and handling personal data. It places restrictions on how agencies can share an individual's data with other people and agencies. It lets individuals sue the government for violating its provisions.

Table 3. Consumer online privacy litigation

Year: 2007. Case: FTC v. Albert.
Issues: Spyware. The code interfered with the functioning of the computer and was difficult for consumers to uninstall or remove. In addition, the code tracked consumers' Internet activity, changed their home page settings, inserted new toolbars onto their browsers, inserted a large side "frame" or "window" onto browser windows that in turn displayed ads, and displayed pop-up ads, even when consumers' Internet browsers were not activated.
Contribution: Permanently bars him from interfering with consumers' computer use, including distributing software code that tracks consumers' Internet activity or collects other personal information, changes their preferred homepage or other browser settings, inserts new toolbars onto their browsers, installs dialer programs, inserts advertising hyperlinks into third-party Web pages, or installs other advertising software. It also prohibits him from making false or misleading representations, prohibits him from distributing advertising software and spyware, and requires that he perform substantial due diligence and monitoring if he is to participate in any affiliate program.

Year: 2006. Case: FTC v. Petco.
Issues: Web site security flaws allowed SQL injection attacks to gain consumer credit card information.
Contribution: Must construct a system-wide security plan that will adequately protect consumers from security breaches via the company's Web site. Will be subject to a biennial security audit from a third-party auditor to validate its effectiveness. Prevents Petco from misrepresenting its security system to consumers in the future and mandates record-keeping provisions to permit the FTC to monitor compliance.

Year: 2006. Case: FTC v. BJ's Wholesale Club.
Issues: Collection of magnetic stripe information from consumers and failing to secure the information appropriately.
Contribution: Must implement and maintain an information security program that is reasonably designed to protect the security, confidentiality, and integrity of personal information collected from consumers. The program must assess internal and external threats to security and design reasonable safeguards to control the risks identified. Must obtain an independent assessment and report of the security program within 180 days of the order and biennially for 20 years.

Year: 2006. Case: Banknorth, N.A. v. BJ's Wholesale Club.
Issues: Unauthorized parties (hackers) gained access to credit card information of cardholders and used the information fraudulently.
Contribution: Establish and implement, and thereafter maintain, a comprehensive information security program that is reasonably designed to protect the security, confidentiality, and integrity of personal information collected from or about consumers. Obtain an assessment and report (an "Assessment") from a qualified, objective, independent third-party professional, using procedures and standards generally accepted in the profession, within one hundred and eighty (180) days after service of the order, and biennially thereafter for twenty (20) years after service. Maintain, and upon request make available to the Federal Trade Commission for inspection and copying, a print or electronic copy of each document relating to compliance.

Year: 2005. Case: FTC v. Cartmanager International.
Issues: Engaged in unfair trade practices by selling to third parties the personal information of nearly one million customers. Its Web pages collect consumer name, billing and shipping address, phone number, e-mail address, and credit card information. Violated merchants' privacy policies, which provided that the personal information collected through their Web sites would not be sold, traded, or rented to third parties.
Contribution: Cannot sell, rent, or disclose to any third party for marketing purposes any personally identifiable information collected from consumers prior to the date of the order. Can sell, rent, or disclose such information after the date of the order provided there is clear and conspicuous written notice and they obtain a written certification that the merchant's privacy policy allows the sale, rental, or disclosure, or, prior to collecting personal data, the merchant discloses that the consumer is leaving the merchant's Web site and is subject to Cartmanager's privacy policy.

litigate a breach in information security along with the Federal Information Security Management Act. In addition, many of the privacy categories are closely related or result from each other. An attacker could use spyware to gain information about a consumer that may lead to identity theft, or identity theft could be the result of an attacker gaining access to consumer information through an information security breach at a business. Identity theft could result from most of the categories, but it could also occur without any interaction with an online entity, for example through non-shredding of personal documents; therefore, it is a category in and of itself.
The model illustrates a top-down view of
the consumer online privacy protection flow.




Figure 2. Consumer online privacy relational model. The figure maps the consumer online privacy categories (information security breaches; information privacy; identity theft and pre-texting; health information privacy; underage consumers; and spyware, malware, viruses, cookies, and spam) to the legislation that provides government protections (Federal Privacy Act, Federal Information Security Management Act, Homeland Security Act, Health Insurance Portability and Accountability Act, Fair and Accurate Credit Transactions Act, Children's Online Privacy Protection Act, CAN-SPAM Act, and U.S. Safe Web Act), to representative litigation (FTC v. Petco 2006, FTC v. BJ's Wholesale Club 2006, Banknorth v. BJ's Wholesale Club 2006, FTC v. Xanga.com, FTC v. Albert 2007, and a felony HIPAA conviction for selling patient information), and to the actions available to consumers (register a complaint at ftc.gov, notify law enforcement, and notify Health and Human Services).

The following two survey studies examine consumer online behaviors and privacy concerns, reinforcing the vital need for a stronger role by the government and business community, as well as greater privacy awareness among online consumers themselves.

Survey Studies

Survey Study I: Consumers' Protection of Online Privacy and Identity
The study by Milne, Rohm, and Bahl (2004) examined results from three consumer surveys in which attitudinal, behavioral, and demographic antecedents that predict the tendency to protect one's privacy and identity online were explored. The research looks at online behaviors that increase or reduce the risk of online identity theft and indicates that the propensity to protect oneself from online identity theft varies across the population. The survey findings are summarized as follows:

- There was a significant positive relationship between privacy concern and active resistance.
- Those who had bought online, provided e-mail, and registered for a Web site had higher rates of protection and a higher number of hours on the Web.
- Males were more likely to protect their information online than females.
- Protection behavior increased with years of schooling.
- Younger online adults were more vigilant than older adults in protecting information online.

Overall, the study found that consumers are not


protecting themselves adequately, and therefore
there is a need for a stronger role by the government and business community to combat identity

theft. The researchers suggest that the government enact new laws to more effectively monitor business practices, that businesses take responsibility for the security of sensitive customer information in the off-line as well as the online context, and that they implement technological solutions including digital certificates and signatures, biometrics, and other authentication approaches (Milne et al., 2004). In addition, they call for the public sector to expand educational programs for consumers.
Furthermore, the study covers areas such as identity theft and online privacy statements, and it discusses updating virus protection often and clearing cookies and the computer cache after browsing the Internet, as would be expected given that these were surveys completed by various online users. There are also other privacy concerns for consumers besides those covered in the surveys, such as security breaches from hacking and SQL injection attacks on their information stored within business networks, unscrupulous online Web sites collecting information from underage online users, and inappropriate release of information not related to online transactions by businesses or their agents.

Survey Study II: User Privacy Concerns and Control Techniques
Chen and Rea (2004) examine the relationship
between two types of user privacy concerns
and how users control personal information.
The two types of privacy concerns addressed
are control of personal information and trust
of online businesses. The three privacy control
factors include:

- Falsification of private information, which includes altering one's personal information and removing browser cookies to obtain access to online resources that require some form of registration.




- Passive reaction, which includes ignoring or deleting the unwanted presence.
- Identity modification, which includes changing one's personal identity, using gender-neutral IDs, and maintaining multiple identities.

The study found that consumers do try to protect themselves, but there is much yet to be studied regarding online privacy. The fact is that many consumers do not demonstrate sufficient awareness of their online privacy, which calls for better privacy controls.

Analysis and Implications


Consumers enjoy surfing the Internet to learn about everything, to buy anything, and to communicate with anyone who has a computer connected to the Internet. But how many net visitors think about the personal information that may drift around the Internet while they are online, or about who may be looking at that information?
The study by Milne et al. (2004) suggests that consumers are increasingly aware of the dangers of providing sensitive information online. They are, however, not taking adequate technical protections and are often unable to differentiate between a safe site and an unsafe site. Carefree online buying behavior and ignorance about the protection tools provided by the browser, the operating system, and the open source community are some of the reasons behind the lack of online consumer privacy protection. Chen and Rea (2004) state that consumers appear to use three types of privacy controls (falsification, passive reaction, and identity modification) when (a) there is a possibility that others might observe and use the information (unauthorized secondary use) and (b) they are concerned about giving out personal information. The study also discusses privacy protections such as encryption and anonymity. In many cases, these tools are either not available or simply infeasible for users.


The results of these two survey studies point


out that online consumers need to be educated and encouraged to follow protective measures in three areas: technical, educational, and behavioral.
However, there are responsibilities for the governments and the businesses as well. What is required
is a concurrent effort among all stakeholders to
protect privacy of all entities involved. The surveys
point out that the online privacy issue is global
rather than local. The researchers suggest the
government enact new laws to effectively monitor
business practices and businesses take responsibility for the protection of online consumers. Along
with laws and standards, available technological
solutions including digital certificates/signatures,
biometrics, and other authentication approaches
can be used.
Online privacy concerns started with the inception of the Internet. These concerns are amplified tremendously by powerful computing capabilities for gathering and processing vast amounts of data, coupled with the Internet's capacity for dispensing information instantly around the globe. In short, success or failure in tackling this problem depends on the willingness and joint effort of all parties involved.

Future Trends
The growth of the online community does not
seem to be slowing. There are new and inventive
ways to use the Internet and the degree to which
consumers can interact online is also growing.
There are now ways to collaborate online that
did not exist just a few short years ago. This new
interaction online allows anyone to voice opinions
or work together. The success of the auction Web site eBay has been due largely to the ability of consumers to voice their opinions by rating their online transactions. The rating system allows positive, negative, or neutral comments to be posted
regarding specific transactions. The rating and
comments are viewable by anyone and are used


to build trust with the online entity. These rating


systems are used by many other online entities
as a way for consumers to give other consumers
their opinions. This trend in allowing consumers
to rate their interactions seems to be a successful
model and will continue to grow in popularity.
Consumers are now demanding more control over
their online interactions and the rating systems
give a controlled venue for opinions. Consumers
are now relying on other consumers to determine
trust in an online business instead of relying solely
on the business.

Legislators need to be aware of what is happening within the technology area and respond
with legislation that protects consumers against
those who would misuse the technology. Each
year many proposals are introduced to further define consumer privacy and protections, but most are never enacted into law. Consumer
groups need to remain vigilant in their education
of legislators on what misuses of information are
occurring and why it is imperative they act.

Conclusion
Online privacy concerns are present in everyday
life for most consumers in the United States.
Each day there are millions of interactions
with businesses that store and use our personal
information. Each day businesses are trying to
find new ways to interact with consumers and
get ahead in their business. Each day there are
unsavory and unethical people trying to steal consumers' personal information. The balance
between convenience and safety in interacting
with an online entity is a struggle. Consumers
want to provide information to the online entity to
conveniently interact for their own purposes, but
want to control how the entity then uses, stores,
recycles, and disposes of the information as well.
The government is there enacting legislation to
try to define those gray areas, but there are still
so many instances where businesses are not complying with the laws. Consumers are left with the
consequences when a business fails to properly
handle their personal information. The call for
more consumer education is a necessary step in
trying to battle the privacy issues. Consumers
need to be aware of how a business is using their
information and be able to exert pressures on a
business to conform to the required standards of
society in an information age.

Future Research Directions

Consumer online privacy is an important subject


and will remain an important subject as long as
there are businesses collecting consumer information, online transactions, and hackers trying to
gain access to that information. Future investigation could include the following:

- How can consumers protect their information? Finding ways to make opt-in and opt-out easier and giving consumers options on how a business stores and handles their information. Should consumers be able to
more specifically control the storage time
and be informed exactly what information
is being stored?
- Investigating a new model for personal information online: creating a secure personal
profile with your information and when
interacting with online businesses, you
provide an encrypted transaction number
that authenticates to your profile and allows
a transaction to be created and stored under
your profile. You are identified by the transaction number and not a name to the online
business. While the business is still storing
the transaction, their records do not include
personally identifiable information.
- Gaps in legislative protections for consumers: recommendations for legislative actions on how to fill the gaps in current legislation to bring it up to current business model expectations. HIPAA was created before offshore outsourcing began its upswing in healthcare, and there are currently no provisions in the rules for sending consumer information beyond our borders.
- Investigation within different age groups
on specific concerns about online privacy.
Younger generations have concerns, but
still interact online because it is part of their
culture. Other generations choose to not
interact online for a couple of reasons. They
may lack understanding of the technology
and/or their judgments are usually based
more on personal interactions to create trust.
How can they trust an online entity when
there is no personal interaction? What would
it take for other generations to trust online
transactions?

References
Ambrose, S., & Gelb, J. (2006). Consumer privacy
litigation and enforcement actions in the United
States. The Business Lawyer, 61, 2.
Ashworth, L., & Free, C. (2006). Marketing
dataveillance and digital privacy: Using theories of justice to understand consumers online
privacy concerns. Journal of Business Ethics,
67(2), 107-123.
Baratz, A., & McLaughlin, C. (2004). Malware:
what it is and how to prevent it. Retrieved November 11, from http://arstechnica.com/articles/
paedia/malware.ars
BJ's Wholesale Club settles FTC charges. (2005). Retrieved from http://www.ftc.gov/opa/2005/06/bjswholesale.htm
Brooke, J., & Robbins, C. (2007). Programmer gives up all the money he made distributing spyware. Retrieved from http://www.ftc.gov/opa/2007/02/enternet.htm
Chen, K., & Rea, A. (2004). Protecting personal
information online: A survey of user privacy
concerns and control techniques. The Journal of
Computer Information Systems, 44(4), 85-93.
Children's Online Privacy Protection Act. (2000). Enacted April 22, 2000. Retrieved from http://www.epic.org/privacy/kids/
Congress Passes Safe Web Act 2006. (2007).
Retrieved January 31, 2007, from http://www.
epic.org
COPPA protects children but challenges lie
ahead. (2007). Retrieved from http://www.ftc.
gov/opa/2007/02/copparpt.htm
COPPA FAQs. (n.d.) Retrieved from http://www.
ftc.gov/privacy/coppafaqs.htm
Email Security and Anonymity. (2004). Retrieved
from http://www.anonic.org/email-security.
html
Federal Privacy Act of 1974. (n.d.). Retrieved from
http://www.usdoj.gov/foia/privstat.htm
Federal Information Security Management Act of 2002. (2002). Retrieved from http://en.wikipedia.org/wiki/Federal_Information_Security_Management_Act_of_2002
Fraud.org. (n.d.). Retrieved from http://www.
fraud.org/tips/internet/phishing.htm, http://www.
phishinginfo.org/
FTC v. BJ's Wholesale Club, Inc. (2005). Filing ordered May 17, 2005. Retrieved from http://www.ftc.gov/os/caselist/0423160/050616agree0423160.pdf
FTC.gov. (2006). Retrieved from http://www.ftc.
gov/bcp/edu/microsites/idtheft/consumers/aboutidentity-theft.html#Whatisidentitytheft

A Taxonomic View of Consumer Online Privacy Legal Issues, Legislation, and Litigation

FTC File No. 032 3221. (2004). Petco settles


FTC charges. Retrieved from http://www.ftc.
gov/opa/2004/11/petco.htm

The CAN-SPAM Act: Requirements for Commercial Emailers. (2004). Retrieved from http://www.
ftc.gov/bcp/conline/pubs/buspubs/canspam.htm

FTC File No. 062-3073. (2006). Xanga.com to pay


$1 million for violating children's online privacy
protection rule. Retrieved from http://www.ftc.
gov/opa/2006/09/xanga.htm

Top Ten Ways to Protect Online Privacy. (2003).


Retrieved from http://www.cdt.org/privacy/guide/
basic/topten.html

FTC.gov Spyware. (2007). Retrieved from


http://www.ftc.gov/bcp/conline/pubs/alerts/spywarealrt.htm
Health Insurance Portability and Accountability Act of 1996 (HIPAA; Kennedy-Kassebaum Act). (n.d.). Retrieved from http://aspe.hhs.gov/admnsimp/pl104191.htm
Homeland Security Act, Cyber Security Enhancement Act enacted December 13, 2001. (2002).
Retrieved from http://www.govtrack.us/congress/
bill.xpd?bill=h107-3482
Medlaw.com. (2006). Retrieved from http://www.
medlaw.com/healthlaw/Medical_Records/8_4/
woman-pleads-guilty-to-se.shtml
Michael, P., & Pritchett, E. (2001). The impact of HIPAA electronic transmissions and health information privacy standards. Journal of the American Dietetic Association, 101(5), 524-528.
Milne, G., Rohm, A., & Bahl, S. (2004). Consumers' protection of online privacy and identity. Journal of Consumer Affairs, 38(2), 217-233.
Richards, N. M. (2006). Reviewing the digital
person: Privacy and technology in the information age by Daniel J. Solove. Georgetown Law
Journal, 94, 4.
Summary of the HIPAA Privacy Rule by HIPAA Compliance Assistance. (2005). Health & Human Services, May.
Swartz, N. (2006). HIPAA compliance declines, survey says. Information Management Journal, 40(4), 16.

USA Patriot Act of 2001 enacted October 23,


2001. (2001). Retrieved from http://www.govtrack.
us/congress/bill.xpd?bill=h107-3162
US Safe Web Act of 2006 enacted by 109th Congress March 16, 2006. (2006). Retrieved from
http://www.govtrack.us/congress/bill.xpd?tab=
summary&bill=s109-1608
Vasek, S. (2006). When the right to know and
right to privacy collide. Information Management
Journal, 40(5), 76-81.

Additional Readings
37 states give consumers the right to freeze credit files to prevent identity theft; Consumers Union offers online guide on how to take advantage of new state security freeze laws. (2007). PR Newswire, July 16.

Anthony, B. D. (2007). Protecting consumer information. Document Processing Technology, 15(4), 7.

Carlson, C. Poll reveals data safety fears. eWeek, 22(50), 29.

Chellappa, R., & Sin, R. (2005). Personalization versus privacy: An empirical examination of the online consumer's dilemma. Information Technology and Management, 6(2-3), 181-202.

de Kervenoael, R., Soopramanien, D., Hallsworth, A., & Elms, J. Personal privacy as a positive experience of shopping: An illustration through the case of online grocery shopping. International Journal of Retail & Distribution Management, 35(7), 583.

DeMarco, D. A. (2006). Understanding consumer information privacy in the realm of Internet commerce: Personhood and pragmatism, pop-tarts and six-packs. Texas Law Review, 84(4), 1013-1064.

New Trend Micro Internet security products strengthen personal information protection and deliver enhanced performance. (2007). Al Bawaba.

Erickson, K., & Howard, P. N. (2007). A case of mistaken identity? News accounts of hacker, consumer, and organizational responsibility for compromised digital records. Journal of Computer-Mediated Communication, 12(4), 1229-1247.

Online privacy policies: An empirical perspective on self-regulatory practices. (2005). Journal of Electronic Commerce in Organizations, 3, 61-74.

Kelly, E. P., & Erickson, G. S. (2004). Legal and privacy issues surrounding customer databases and e-merchant bankruptcies: Reflections on Toysmart.com. Industrial Management + Data Systems, 104(3/4), 209.

Kobsa, A. (2007). Privacy-enhanced personalization. Communications of the ACM, 50(8), 24.

Lauer, T. W., & Deng, X. (2007). Building online trust through privacy practices. International Journal of Information Security, 6(5), 323.

Pan, Y., & Zinkhan, G. M. (2006). Exploring the impact of online privacy disclosures on consumer trust. Journal of Retailing, 82(4), 331-338.

Roman, S. (2007). The ethics of online retailing: A scale development and validation from the consumers' perspective. Journal of Business Ethics, 72(2), 131.

Smith, A. D. (2004). Cybercriminal impacts on online business and consumer confidence. Online Information Review, 28(3), 224-234.



Chapter III

Online Privacy, Vulnerabilities,


and Threats:
A Manager's Perspective
Hy Sockel
DIKW Management Group, USA
Louis K. Falk
University of Texas at Brownsville, USA

Abstract
There are many potential threats that come with conducting business in an online environment. Management must find a way to neutralize or at least reduce these threats if the organization is going to maintain viability. This chapter is designed to give managers an understanding of, as well as the vocabulary needed for a working knowledge of, online privacy, vulnerabilities, and threats. The chapter also highlights techniques that are commonly used to impede attacks and protect the privacy of the organization, its customers, and its employees. With the advancements in computing technology, any and all conceivable steps should be taken to protect an organization's data from outside and inside threats.

Introduction
The Internet provides organizations unparalleled
opportunities to perform research and conduct
business beyond their physical borders. It has
proven to be a vital medium for worldwide commerce. Even small organizations now rely on

Internet connectivity to communicate with their


customers, suppliers, and partners. Today, employees routinely work from areas beyond their office's physical boundaries. They regularly transport
sensitive information on notebook computers,
personal digital assistants (PDAs), smartphones,
and a variety of storage media: thumb drives, CDs,



DVDs, and even on floppies. It is not uncommon


for employees to work offsite, at home, or out of a hotel room. Outside the office, they often use less-than-secure Internet connections: dial-up, cable, Internet cafés, libraries, and wireless.
Organizations often employ portals to share information with their stakeholders; however, these portals are not always secure from would-be attackers. In order to protect the organization from vicious and malicious attacks, management needs to understand what it is up against. Even if the organization does not conduct any business on the Internet, it is still not out of harm's way. Viruses, Trojans, and spyware can come from multiple sources: floppy discs, CDs, thumb drives, and even mobile phones. To complicate the matter even more, the information technology (IT) environment at many organizations has become obscure, partially due to new regulations and industry standards. The standard has changed: it is no longer enough to be secure and protect the business's assets; organizations need to be able to demonstrate that they are compliant and that security is an ongoing concern. Failure to do so could leave them facing stiff penalties (Forescout, 2007).
The purpose of this chapter is to address
some of the potential threats that come with
conducting business in an online environment.
The chapter highlights the relationship between privacy, vulnerabilities, and threats. It delves into techniques that are commonly used to thwart attacks and protect individuals' privacy. In the age
of unrest and terrorism, privacy has grown even
more important, as freedoms are compromised
for security.
The news is loaded with stories about security
breaches. For example:
In May of 2007, the news of the TJ Maxx security
breach shook up the banking and retail industry.
At first it was estimated that hackers had downloaded at least 45.7 million credit- and debit-card
numbers; however, court filings indicated that



number was closer to 96 million. Estimates for


damage range from $216 million to $4.5 billion.
The breach was blamed on extensive cyber-thief activity within TJ Maxx's network from 2003 through June 2004 and then again from mid-May 2006 through mid-December 2006 (Schuman, 2007). However, others blame the breach on weak wireless security: Ou (2007) revealed that the retailer's wireless network had less security than many people have on their home networks.
Another example is:
On April 5, 2002, hackers exploited vulnerabilities in a server holding a database of personnel information on California's 265,000 state employees. The state responded, and the world listened. California is one of the largest economies in the world, bigger than most countries. The attack's victims included the then Governor Gray Davis and 120 state legislators. The breach compromised names, social security numbers, and payroll information. In response, the state legislature enacted a security breach notification law, Senate Bill (SB) 1386.
To put this in perspective, if online privacy is described in terms of a risk triangle, the three corners are vulnerabilities, threats, and actions, where actions represent anything the organization can (and should) do to mitigate attacks. Applications, like ships, are not designed and built to sit in a safe harbor; they are meant to be used in churning, chaotic waters. It is important to understand threats and vulnerabilities well enough to have a good idea of what to expect, so that strategies and tools can be put in place to mitigate the consequences (Bumgarner & Borg, 2007).

Vulnerability
Software vulnerabilities are not going away; in fact, they are increasing. According to the Coordination Center at Carnegie Mellon University (CERT, 2007), there was an average of over 10 vulnerabilities discovered every day in 2003 (3,784 in total). This number jumped to over 5,500 in the first nine months of 2007.
Software flaws have become the vulnerabilities of choice for attackers. Flaws cut across the entire enterprise application stack, including Web and application servers, databases, and operating systems. Dr. (Rear Admiral) Grace Hopper (1906-1992), a highly respected and accomplished computer scientist, indicated that all software has problems and that it is impossible to have a perfect system. She articulated this point using the following example: if the probability of an individual module having an error in the code is just one in a hundred (1%), and the system has several hundred modules, then the probability of an error somewhere in the system approaches 100%. This observation is particularly relevant in that most commercial software developers use complex software development toolkits (SDKs) to improve their productivity and effectiveness.
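Hopper's point can be made precise with a simple calculation (an illustration added here, assuming module errors are independent): even a 1% per-module error rate makes a defect somewhere in a several-hundred-module system nearly certain.

# Probability that a system of n modules contains at least one coding error,
# assuming each module independently has error probability p.
def system_error_probability(p: float, n: int) -> float:
    return 1.0 - (1.0 - p) ** n

for n in (100, 300, 700):
    print(n, round(system_error_probability(0.01, n), 3))
# prints: 100 0.634, 300 0.951, 700 0.999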
Qualys (2006), a security vendor, studied over 40 months of data scans (September 8, 2002 to January 31, 2006) and identified nearly 1,600 unique critical vulnerabilities from a total infestation of more than 45 million vulnerabilities. The data scans showed that more than 60% of critical vulnerabilities were in client applications such as Web browsers, backup software, media players, antivirus software, and Flash.
The Third Brigade found that vulnerabilities
generally fall into one of the following categories
(Aberdeen Group, 2005):

- Vulnerabilities caused by incorrectly configured systems
- Failure to turn off factory defaults, guest accounts, and outdated software
- Failure to maintain anti-virus and spam updates
- Failure to change default values, leaving holes
- Well-known bugs in system utilities
- Poor or ignorant policy decisions
- Unapplied vendor security patches; Aberdeen states that 95% of attacks are against known vulnerabilities for which patches are available.

Vulnerabilities do not have to be broken program code; Norman (1983) indicated that errors in system designs, which provoke erroneous entries by users, can also be considered vulnerabilities that can be intentionally exploited by attackers. Individually and collectively, vulnerabilities can create major risks for organizations. Weak policies and protection can result in the release of personally identifiable information (PII). The release of PII is not the only problem. Another issue is that hackers can obtain important data and modify it. Suddenly, there are additional names on the preferred lists, the payroll, and accounts payable, and outsiders could be given authority or consideration that they are not entitled to. An organization's strategic plans could be compromised. Additionally, the release of PII can weaken the public's confidence in the organization and subject the organization to litigation, large fines and reparation costs, rigorous investigations, and oversight.

Threats
Threats are not the same as vulnerabilities; threats are things that can take advantage of vulnerabilities. Security threats, broadly, can directly or indirectly lead to system vulnerabilities (Im & Baskerville, 2005). An analogy might be an army fort surrounded by the enemy where someone accidentally left the fort's front gate wide open. The open gate is a vulnerability and the threat is the opposing force. Translating this analogy to data and information, the vulnerability would be a poorly




protected system, and the threat is the criminal


hacker community. In this case, "poorly protected" could be construed as any of a number of things, including absolutely no protection, software that is not updated, inappropriately defined security rules, and weak passwords.
In general, it is important to ensure that sensitive information and systems are protected from
all threats, both internal and external. Typically,
this is done by separating the systems from the
networks. However, this is not always possible;
with the advent of e-business there is a need for
organizations to share information.
For example: an organization gives its partners (A)
and (B) permission to look at its online schedule
(instead of calling a clerk as they had in the past).
This can create the opportunity for partner A to
look at (or modify) partner B's data. If the data
is of a personal type, say medical, several laws
could easily be violated. If it is indicated in the
privacy policy that data/information is not shared,
the individual whose data is released may have
rightful cause to institute litigation.
Clearswift, a leading provider of content
security products, has categorized five major
message traffic threat types as: asset theft, disruption, repudiation, content abuse, and denial
of service.
Asset theft happens via spoofing or social engineering, when an outsider pretends to be an authorized user and requests information not available to an unauthorized user. More commonly, however, it is the inadvertent sending of sensitive information, or sending by disaffected insiders.
Disruption is a common threat, which includes anything that keeps users (and services, e.g., e-mail, fax) from doing what they are supposed to do. Other workplace disruption can include dissemination of personal, pornographic, or non-business information.
Repudiation (denial) is concerned with either
party (sender or receiver) being able to declare



that an event did not happen. Public-key techniques, such as those behind the Diffie-Hellman key exchange, enable digital signatures, which provide assurance that the message was actually sent and/or received by the
intended parties. Digital signatures are accepted as
evidence in a court of law. This is critical because
oftentimes parties involved in transactions do not
know each other.
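A minimal sketch of the signing-and-verification step that underpins non-repudiation, using the third-party Python cryptography package (an assumed choice of library; the chapter does not name one):

from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# The sender signs the message with a private key only he or she holds.
private_key = Ed25519PrivateKey.generate()
message = b"Ship 500 units to warehouse 7 on Friday."
signature = private_key.sign(message)

# Anyone holding the matching public key can verify the signature; a valid
# check ties the message to the sender's key, so the sender cannot plausibly
# deny having sent it, and the receiver cannot deny what was received.
public_key = private_key.public_key()
try:
    public_key.verify(signature, message)
    print("Signature verified: the message is bound to the sender's key.")
except InvalidSignature:
    print("Verification failed: the message or signature was altered.")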
Content abuse is similar in scope to repudiation, but it is focused on the content of the message and not on whether it was sent or received. It deals with disputes between the sending and receiving parties over what was sent and what was received.
Denial of service (DoS) and distributed DoS (DDoS) attacks result when a party is bombarded with
more messages than it can handle, causing the
system to use all its resources to handle non-legitimate traffic. This can happen by bombarding
the victims machine with thousands to millions
of messages so that it cannot respond to legitimate
requests or responds so slowly that it is effectively
unavailable. DoS attacks are considered violations of the Internet Architecture Board's (IAB)
Internet proper use policy concerning Internet
ethics passed January 1989 (often referred to as
RFC 1087; see http://tools.ietf.org/html/rfc1087).
In the U.S. (and many countries), DoS is a serious
federal crime under the National Information Infrastructure Protection Act of 1996 with penalties
that can include fines and imprisonment.
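The chapter does not name a specific countermeasure at this point, but a common first line of defense against such message floods is rate limiting. The following is a minimal token-bucket limiter sketched in Python; the class name and the example thresholds are illustrative assumptions, not values from the text:

import time

class TokenBucket:
    """Minimal token-bucket rate limiter: allows roughly `rate` requests
    per second on average, with bursts up to `capacity`."""
    def __init__(self, rate: float, capacity: float):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill tokens based on elapsed time, up to the bucket capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True        # request is served
        return False           # request is dropped or queued

# Example: serve at most 100 requests/second with bursts of 20.
bucket = TokenBucket(rate=100, capacity=20)
if not bucket.allow():
    print("Too many requests; dropping this one.")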

SOCIAL ENGINEERING
It is not always a technical issue; a perpetrator can use chicanery and/or persuasion to manipulate unsuspecting people into either revealing sensitive information (such as logon and password) or compromising perimeter defenses by installing
inappropriate software or portable storage devices
(that are seeded with malware) on computer networks. For example, an approach of phishing is
to ask a user to fill out a simple fake online form.
The form itself asks almost no personal information; instead, it pre-fills the form with some of the info that the sender already knows about the victim. It asks the person to make up a login name and a password. The criminal hackers know that most people suffer from password overload and tend to reuse the same passwords over and over again. Figure 1 (ploy to capture personal information) is a representative sample (taken off the net November 4, 2007):

Figure 1. Ploy to capture personal information

RISK

Risk is involved in everything, every process, and every system. Operational risk is often defined as the risk of loss resulting from inadequate or failed internal processes, people and systems, or from external events. Risk is one of those things that no one can escape and is hard to define. In general, risk is the probability of a negative outcome because some form of threat will be able to exploit vulnerabilities against an asset. Many define the value of a risk attack as the value of an asset times the probability of a threat times the probability of an undiscovered vulnerability times some impact factor (representing reparations) times the possibility of the event. While the formula (see Equation 1, risk dollar value) is straightforward, coming up with the values and probabilities is not. The important issue is not the devised dollar value, but what the asset really means to the organization and how they are going to use it.

Equation 1. Risk dollar value:
Risk $ = Asset value * Threat * Vulnerability * Impact * Likelihood * Uncertainty
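As a purely illustrative calculation (every input below is invented to show how the factors of Equation 1 combine; none of the values come from the chapter):

# Illustrative only: the probabilities and dollar figures are made up to show
# how Equation 1 combines; they are not values suggested by the chapter.
def risk_dollar_value(asset_value, threat, vulnerability, impact, likelihood, uncertainty):
    """Multiply the factors of Equation 1 into a single expected-loss figure."""
    return asset_value * threat * vulnerability * impact * likelihood * uncertainty

# A customer database worth $500,000, with estimated probabilities/factors:
risk = risk_dollar_value(
    asset_value=500_000,   # value of the asset
    threat=0.30,           # probability of a threat
    vulnerability=0.10,    # probability of an undiscovered vulnerability
    impact=1.5,            # impact factor (representing reparations)
    likelihood=0.25,       # possibility of the event
    uncertainty=0.8,       # confidence adjustment
)
print(f"Risk $ = {risk:,.2f}")   # roughly $4,500 in this invented example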

Elimination of risk is categorically impossible; the best that can be hoped for is to get it under control. Even if it were possible, the cost
and scalability issues of risk avoidance have to
be weighed against the cost of the probable losses
resulting from having accepted rather than having
eliminated risk (Pai & Basu, 2007).
Qualys, Inc. (2006) analyzed a global data
pool of more than 40 million IP scans with their
product QualysGuard. Data analysis revealed
the six axioms of vulnerabilities. These axioms
are important because they help management
understand the nature of possible attacks and
why and how their data could be at risk of being
compromised. Qualys Inc. (2006) believes that understanding the behavior of vulnerabilities is essential to setting effective security strategy and proactively implementing security solutions.
The axioms of Qualys's research:

1. Half-life: The average time it takes an organization to patch (apply a fix to) half of the most dangerous vulnerabilities. The 2006 findings indicate a decrease in the half-life to 19 days (down from 30 in 2003) on external systems. They found that the exposure of unpatched systems continues during the significantly long period of half-life dissipation and increases as the severity decreases (see the sketch following this list).
2. Prevalence: The degree to which a vulnerability poses a significant threat. They found that half of the most prevalent critical vulnerabilities are replaced by new vulnerabilities each year. This means there is ongoing change to the most important threats to our networks and systems.
3. Persistence: The life spans of some vulnerabilities are unlimited; as soon as the current infection is addressed, a variant may appear. In one day Sophos found over 300 variants of the Stranton virus. The risk of re-infection can recur during deployment of machines with a faulty, unpatched operating system.
4. Focus: The 2006 study data revealed that 90% of vulnerability exposure is caused by 10% of critical vulnerabilities.
5. Exposure: The time-to-exploit cycle is shrinking faster than the remediation cycle. Eighty percent of critical vulnerability exploits are available within the first half-life after their appearance. Since the duration of vulnerability announcement-to-exploit-availability is dramatically shrinking, organizations must eliminate vulnerabilities faster.
6. Exploitation: Nearly all damage from automated attacks occurs during the first 15 days of the outbreak. Automated attacks pose a special hazard to network security because they inflict damage swiftly with little time for reaction.
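To make the half-life axiom concrete, the sketch below assumes the unpatched population simply keeps halving every 19 days (an exponential-decay simplification for illustration; the resulting percentages follow from that assumption and are not figures reported by Qualys):

# Back-of-the-envelope sketch: if half of the exposed systems are patched
# every 19 days (the 2006 external-system half-life), what fraction is
# still unpatched after t days? Assumes simple exponential decay.
HALF_LIFE_DAYS = 19

def unpatched_fraction(days: float, half_life: float = HALF_LIFE_DAYS) -> float:
    return 0.5 ** (days / half_life)

for days in (19, 38, 57, 90):
    print(f"After {days:3d} days: {unpatched_fraction(days):.1%} still unpatched")
# After 19 days ~50%, 38 days ~25%, 57 days ~12.5%, 90 days ~3.7%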

Cannon and Kessler (2007) believe that the rapid increase in breaches and incidents can be directly related to technology. They indicate that the increase in 1) computer processing power and data storage capacity and in 2) higher data transmission bandwidth has exacerbated the problem.
This in conjunction with the massive connectivity
of information systems afforded by the Internet
and World Wide Web allow for the mass collection
and misuse of sensitive personal data.

ELECTRONIC RISK MANAGEMENT


There is a large group of people who believe that, in the final analysis of security breaches, most problems should not be blamed on hackers or malicious employees; instead, the incidents should be blamed on a lack of common sense. To them, the vast majority of breaches can be classified under the title of carelessness. When people do not pay attention to what they are doing, such as putting a letter in the wrong envelope, adding a copy to an e-mail, or losing equipment or hardware, the real culprit is a failure to follow procedures (Padilla, 2007).
However, regardless of how breaches are
caused: by ignorance, carelessness, inside users,
or criminal hackers, there are a lot of them. The
Privacy Rights Clearinghouse (2007) indicates
that more than 48 million records containing
sensitive personal information have been involved
in some form of a security breach in just January
2007 alone. Cannon and Kessler (2007) define a data breach as "the unauthorized access to and acquisition of data in any form or format containing sensitive information that compromises the security or confidentiality of such information and creates a reasonable risk of its misuse."

IDC indicates that strong corporate governance is the foundation of successful protection
of corporate assets from a wide variety of threats
(CNET, 2004). To that end, organizations need to
establish, educate, and enforce their policies to
effectively ensure the protection they need.
1. Establish: Establish clearly written policies and procedures for all employee communications. The rules must deal with acceptable and unacceptable behavior for the Internet, P2P (peer-to-peer), e-mail, IM (instant messaging), and blogging.
2. Educate: Educate and support written rules and policies with company-wide training. The employees need to understand that the policy is a living document (it will mutate as new threats and issues arise) but compliance is mandatory. This can be a critical issue for an organization because misuse (whether deliberate or accidental) can result in the organization being held responsible by the legal principle of vicarious liability.
3. Enforce: Enforce written rules and policies with a combination of disciplinary action and software. If there is any doubt about employee willingness to adhere to the organization's usage and content rules, consider applying a technological solution to the people problem. Tools can help with the installation of hardware, software, and/or appliances, and enforce established policies. The organization can block access to inappropriate sites and stay on top of employees' online activity. Failure to discipline employees for e-mail-related misconduct may encourage other employees to abuse the system and could create liability concerns for the organization. It is important to communicate the policies and to adhere to them. The American Management Association (2005) Electronic Monitoring & Surveillance Survey found that most companies monitor employee Web site usage (76%) and use filters to block access to inappropriate Web sites (65%). Slightly more than a quarter (26%) of responding organizations indicated they went further than admonishing individuals; they terminated them for misuse of e-mail or the Internet.
The World Bank indicates that to reduce e-security risk, day-to-day augmentation of e-security internal monitoring and processes is needed. They indicate that proper risk management is achieved through a comprehensive checklist covering the cyber-risks that affect the network as a whole.
They have refined a technology risk checklist
based upon standards set by ISO 17799 (Glaessner,
Kellermann, & McNevin, 2004).

MALWARE
The term malware (malicious software) is typically used as a catch-all to refer to a variety of
forms of hostile, intrusive, or annoying software
designed to infiltrate or interrupt services from
a single computer, server, or computer network
without the owner's informed consent. The term malware includes all types of troublemakers, such as viruses, worms, kiddy scripts, Trojan horses, and macro (script-context) viruses. Malware seeks
to exploit existing vulnerabilities on systems.
Malware can utilize communication tools to
spread and oftentimes it goes unnoticed. McAfee
Avert Labs (Bernard, 2006) has recorded more
than 225,000 unique computer/network threats.
In just 10 months between January and November
of 2006, they found 50,000 new threats. Google
researchers (as part of the Ghost in the Browser
research) warned that one in 10 Web pages is
hiding embedded malware (Provos, McNamee,
Mavrommatis, Wang, & Modadugu, 2007).
The term malware is often associated with the
characteristic attributes of a virus; self-replicating,
something that embeds itself into other programs,
which in turn can infect other programs. The notion of a self-replicating program is not new; it dates back to John von Neumann's 1949 lectures.
Neumann postulated the theory that a program could reproduce itself. Nearly 35 years later, in November 1983, Professor Fred Cohen substantiated Neumann's work by creating and demonstrating the first computer virus in a computer security seminar. The name "virus" was provided by Len Adleman (the "A" in RSA) (http://en.wikipedia.org/wiki/Malware and http://all.net/books/virus/part5.html).
In 1989, John McAfee (of McAfee Avert Labs)
defined a virus as a computer program created to
infect other programs with copies of itself. It is
a program that has the ability to clone itself so
that it can multiply and constantly seek new host
environments (McAfee & Haynes, 1989). Today,
not all computer viruses inject themselves into
their victims, nor is cloning considered mandatory. Researchers now make a distinction between different malware varieties based on whether they are considered viral or non-viral malware (Cafarchio, 2004):


Viral malware typically replicates rapidly and fairly indiscriminately; its behavior has
a very visible impact. Viral infections might
be used as part of distributed denial of service attack; worms like Code Red are able
to spread worldwide in a matter of hours.
Non-viral malicious software does not replicate. It is planted by hackers, or unknowingly downloaded by unsuspecting users,
or foisted on systems as part of a software
package to track the users behavior and/or
software usage. Non-viral malicious software is designed to be inconspicuous and
stealthy. These types of infections can go
undetected for long periods of time.
There are some virus types of malware that are designed merely to harass users and not to intentionally damage files or the operating system. Malware like the Bearded Trojan is of this style. The Bearded Trojan displays a nude female, and while it is potentially offensive or embarrassing, it often makes
people realize that they are vulnerable and
could have been infected with a virus that
acts as a key logger, or a Web bot (Harley,
Slade, & Gattiker, 2001).
Another example of a non-viral virus is the
ANSI bomb; thankfully, ANSI bombs are not common and they do not reproduce. An ANSI
bomb is a sequence of characters that is
meant to redefine key(s) on a keyboard. Thus,
when the user presses a key the normally
assigned characters for that key are not sent
to the terminal or computer, but rather the
redefined string. This string may contain any
ASCII characters, including <RETURN>
and multiple commands. A function key or
even the space bar could be assigned a string
that invokes a program to do something the
user does not want to happen: copy, delete,
or send material (Harley et al., 2001).

Adware/Spyware

A particularly annoying and dangerous form of malware is adware/spyware. The terms are commonly used interchangeably. The goal of this technology is to gather information without the target person's knowledge or permission. This type of software is used to watch and record which Web sites and items on the Internet the user visits in hopes of developing a behavioral profile of the user that can later be exploited. The slight difference between the two terms is the intent of the software agent. Adware has an advertising aspect in the information it collects, while spyware tracks and records user behavior (in the traditional sense of the word spy).
The problem with spyware is that users
typically store all sorts of sensitive and personal
information in their machines that should not be
made public. Some information is protected by law, such as trade secrets and financial data. The loss of personnel and customer information could wreak havoc for the organization. Additionally, the theft of stored information such as bank account
numbers, credit card numbers, social security
numbers, and pictures could also devastate the
individual.
Another thing that makes adware/spyware
so pernicious is that antivirus programs and firewalls are not very effective against them. While a good antivirus program (AV) is absolutely essential
for any machine, even those that do not connect
to a network (especially if the machine accepts
removable media), it is not enough. AV software
will not protect user machines from spyware.
Viruses and spyware have different properties;
because spyware does not propagate itself like
other forms of malware, it is not likely to be
detected by traditional AV methods.

Botnets

Vint Cerf, one of the founding fathers of the Internet, believes that one in four computers (approximately 150 million out of 600 million) connected to the Internet are compromised and likely
to be unwilling members of a botnet (Fielding,
2007). These machines are often used as proxies
for illegal activities like spamming and credit card
fraud. Botnets have been a growing problem on
the Internet since at least 2002. A bot (short for
robot) is a software agent released onto a computer
connected to the Internet. The bot can download
malicious binary code that compromises the host
turning it into a zombie machine. The collection
of zombies is called a botnet. The servers hosting
the bot binaries are usually located in countries
unfriendly to the United States. The bots are
transparent and run in the background. Bots can
open a channel to a bot controller machine,
which is the device used by the perpetrator (the
bot herder) to issue commands to the bots (Baylor
& Brown, 2006).
Bot herders typically use bot controllers to
harvest user accounts via screen-capture, packet-sniffing, and key-logging techniques. They are routinely used for phishing (some estimate that over 70% of Internet spam is due to them), click
fraud, and malware.
Botnets can be used for attacking various Web
sites by unleashing a barrage of requests against
the site, so that the victim site spends more time
processing the requests coming at it, than it does
doing the job it was intended for. These attacks
employ a technique known as distributed denial
of service (DDoS) attack. The idea behind the
DDoS is to flood a machine, server, or cluster
faster than the server can respond to them. DDoS
chew-up bandwidth and resources, effectively
shutting down the attacked network. In this manner a large botnet can wield an amazing amount
of power. If several large botnets are allowed to
join together, they could literally threaten the
national infrastructure of most countries. On
April 27, 2007, a series of DDoS attacks crippled
the financial and academic Web sites in Estonia
(Kirk, 2007).
Botnets are no longer just annoying, spam-pumping factories; they are big business for
criminals. This shift has awakened large businesses, which historically have either looked the
other way or been in denial about bots infiltrating
their organizations (Higgins, 2007).
The Anatomy of a Typical Botnet Attack

Step 1: The bot herder loads remote exploit code onto an attack machine that might be dedicated to this purpose or an already compromised machine. Many bots use file-sharing and remote procedure call (RPC) ports to spread.
Step 2: Attack machines scan for unpatched
(not current with updates) victim machines
to launch attacks against.
Steps 3 & 4: The victim machine is ordered
to download files (binaries) from another
server (frequently a compromised web or
FTP server).



Step 5: These binaries are run on the victim machine and convert it to a bot. The victim
machine connects to the bot controller and
reports for duty.
Step 6: The bot controller issues commands
to the victim to download new modules,
steal account details, install spyware, attack
other machines, and relay spam.
Step 7: The bot herder controls all bots by issuing commands via the bot controller(s).

World's Biggest Botnets

Storm

There is a new threat, that of the super botnet. While few agree on the actual size of these botnets, they are huge; the number of active members per 24-hour period (not just attached zombies) can be in the hundreds of thousands. Currently, the largest of the new breed of botnets is Storm. Storm broke away from the mold and uses decentralized peer-to-peer (P2P) communication instead of the traditional centralized Internet relay chat (IRC) model. The P2P approach makes it tough to track and tougher to kill; you cannot render it mute by disabling one or two central control machines.
Storm uses a complex combination of malware, which includes worms, rootkits, spam relays, and Trojans. It propagates via a worm or when a user visits an infected site or clicks on a link to one. It is very stealthy; it employs a balanced-use approach and fast-flux. The purpose of fast-flux is to circumvent the IP-based black list technique (see black list). It does this by rapidly rotating DNS records to prevent discovery (Higgins, 2007).

Rbot

Rbot is generally considered the second largest botnet. It employs an old-style communication structure using Internet relay chat. Because it uses an IRC approach, it does not scale very well and is unlikely to rival Storm's size. Rbot's underlying malware uses a backdoor to gain control of the infected machine, installing keyloggers, viruses, and even stealing files from the infected machine, as well as performing the usual spam and DDoS attacks. The really scary part is that Rbot [malware] is readily available to anyone who wants to try to apply some kind of criminal activity in the bot arena (Higgins, 2007).

Whose Fault Is It?

The answer to this question depends on who you ask. It can easily be argued that it is the user's fault. If the user keeps their antivirus up-to-date and stays away from traditional types of sites that harbor malware (celebrity), the problem should be lessened. However, variants of viruses have been tracked in the hundreds per day; it is hard to keep current on protection when there is a whole industry working against you.
Since it may not necessarily be the user, then it must be the developers, or the publisher, for not creating a product that cannot be usurped. Unfortunately, there are highly skilled, university-trained hackers that strive to develop the right code. After all, there is really only one reason for botnets: to make money. Some people blame law enforcement or government for not taking prompt and decisive action. However, many of the bot herders are in countries in which the U.S. does not have jurisdiction. Politicians can pass laws, but never be in the position to have them enforced.
To that end, in 2007, Senators Orrin Hatch (R-Utah) and Joseph Biden, Jr. (D-Delaware) introduced the Cybercrime Act to update existing laws and close what they say are loopholes that online criminals can exploit. The bill takes a multifaceted attack. It lowers the threshold of evidence, and it addresses not only damaged computers but also individuals. It prohibits the creation of botnets that could be used in online attacks. It makes the threat of revealing (extortion) confidential information illegally obtained from computers a crime (Savage, 2007).

Botnets: FBI Operation Bot Roast


In the second week of November 2007, John
Schiefer of Los Angeles, California agreed to
plead guilty to felony charges for building and
using a botnet as large as 250,000 nodes to steal
personal identifying information (PII). The botnet was used to invade individuals' privacy by
intercepting electronic communications being
sent over the Internet from the zombie computers
to PayPal and other Web sites. Later, data mining techniques were used to garner PII such as
usernames and passwords. With the usernames
and passwords, they accessed bank accounts to
make purchases without the consent of the true
owners. The botnet was also used to defraud a
Dutch advertising company. This was the first U.S.
prosecution under the U.S. federal wiretap statute
for conduct related to botnets (Wilson, 2007).
The FBI and the Department of Justice, in an anti-botnet sweep labeled Operation Bot Roast, have arrested three individuals for assembling botnets.
They are charged with felonies. One of the three
arrested is alleged to have used a large botnet
network to send tens of millions of unsolicited email messages. Another is charged with infecting
more than 10,000 computers worldwide, including two Chicago hospitals. The bots caused
the infected computers to, among other things,
repeatedly freeze or reboot, causing significant
delays in the provision of medical services. It
took the hospitals more than 1,000 man-hours
to clean up after the infections (Keizer, 2007;
Albanesius, 2007).
The government is working in conjunction
with industry partners to uncover these schemes.
These include the CERT Coordination Center at
Carnegie Mellon University as well as Microsoft,

and the Botnet Task Force (a low-profile organization initiated by Microsoft in 2004 that acts
as a means of building awareness and providing
training for law enforcement).
In the process, the FBI has identified more
than 1 million hijacked personal computers. The
majority of victims are not aware that their computers have been compromised or their personal
information exploited. The FBI said because of
the widely distributed abilities of botnets they not
only harm individuals but are now considered a
threat to national security, as well as the information infrastructure and the economy.

SEARCH ENGINES
A problem with most search engines is that they are indifferent to content permissions. Certain individuals (such as the head of payroll) may have permission to view all of the company's information, while other individuals (such as the head of personnel) are limited in the type of data they are allowed to see. An employee may be given permission to see their own information but not that of the person working next to them. There may also be certain individuals that are not allowed to see any information at all. Because search engines typically cannot take data ownership and coordinate it with user permissions, problems can
arise when responding to a request.
When implemented carelessly, search engines
have the potential to uncover flaws in existing security frameworks and can expose either restricted
content itself or verify the existence of hidden
information to unauthorized users (Vivisimo,
2006). In this regard, poorly implemented search
engines could release large amounts of personal
identification information. Imagine typing the
name of the CEO in a search engine and receiving a page that lists his personal phone number,
salary, and home address.
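One mitigation, sketched below, is to filter search results against per-document access control lists before returning them to the requesting user. The document IDs, ACL entries, and user names are invented for illustration; they are not from the chapter or any particular product.

# Illustrative sketch: filter search hits against per-document permissions
# before showing them to the requesting user.
from dataclasses import dataclass

@dataclass
class Document:
    doc_id: str
    content: str
    allowed_users: set  # the ACL: user IDs permitted to see this document

INDEX = [
    Document("payroll-2008", "CEO salary and home address ...", {"head_of_payroll"}),
    Document("lunch-menu", "Cafeteria specials for Friday ...", {"everyone"}),
]

def search(query: str, user_id: str):
    """Return only the hits the user is entitled to see."""
    hits = [d for d in INDEX if query.lower() in d.content.lower()]
    return [d for d in hits
            if user_id in d.allowed_users or "everyone" in d.allowed_users]

# The head of personnel searching for "salary" gets nothing back,
# rather than a page of restricted payroll data.
print(search("salary", "head_of_personnel"))   # []
print(search("salary", "head_of_payroll"))     # one restricted document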



WIRELESS MEDIA
Organizations may think their mobile workers are
safe with their new wireless notebooks, but recent
WLAN tracking at the RSA security conference
showed a multitude of vulnerabilities. Some
common faults were that many users were using
hotspots, but had no idea who was sponsoring
the ports. In some cases, it was discovered that
the users were actually talking to other local
computers that also had their connections active
(Shaw & Rushing, 2007).
Wireless devices often remember the last
good site they were connected to and attempt to use it first, which means that if the user did not shut down the port (disconnect from a hot spot
correctly), the computer will look for that spot first,
even if there is a more secure connection available. Another issue is that the port will continue
to actively search for a signal. A critical situation
can arise if the user forgets to disable the wireless
card, and then plugs his/her device into a wired
network. A couple of things could happen: the
network will see the other port and might adjust
its routing information to accommodate it; in
the process it could bypass firewalls and border
security. Another thing that may happen is the
device might also connect to another device via
the wireless port, again bypassing some security,
but elevating the permissions and authority of the
newly connected user to that of the legitimate user.
In either case, the result is a huge hole in security
(Shaw & Rushing, 2007).
Organizations are paying a very high price
for wireless management. The Aberdeen Group
estimates that it costs nearly 10 times more to
manage wireless services and devices compared
to wired-lines (Basili, 2007). In spite of that,
Aberdeen found that 80% of respondents were
planning increases in mobile wireless access.
The RSA Conference is an event that draws
thousands of computer users. Many of them
bring their wireless laptops (and other devices).
AirDefense (2005), a wireless security company credited by many as the founder of the wireless security industry, found that more than half of the 347 wireless devices it monitored during the conference were susceptible to attack. What is
truly amazing is not that it happened once, but
just 2 years later it happened again at another
RSA conference. AirDefense once again found
that more than half of the wireless devices at the
conference network were themselves unsecured
and were vulnerable to attacks; thus leading to the
conclusion that the people responsible for protecting enterprise data were not doing a very good job
of protecting their own assets (Cox, 2007).

Telephones
Wireless telephones with computer-enabled features (such as e-mail and Internet access) have
been compromised; Trend Micro Inc. announced
it had found security flaws on MS Windows
Mobile, a popular operating system used in smartphones. Many individuals who use these devices are executives who routinely access sensitive information. In this case, the main risk is not
malware, but the risk of lost devices.

Mobile Encryption
The news regularly reports that laptops with
thousands of sensitive records on customers or
employees are lost or stolen each month. Organizations know the risks and the threats. These threats
are easy to understand but most organizations do
not allocate the resources necessary to protect
themselves. Encryption is an effective safeguard
for most mobile devices, and one that will relieve
some of the legislative pressures. However, it is
far from being fully adopted; a survey by Credant
(see McGillicuddy, 2006) asked respondents to
list reasons why their companies had not adopted
encryption for mobile devices.

56% indicated it was due to a lack of funding;
51% said encryption was not a priority; and
50% said there were limited IT resources.
In other words: no one wants to pay for it.

Mobile devices are often seen as low-powered, low-capacity corporate tools. As such, there is considerable fear that encryption will add little, but in the end will slow them down. Critics cite
that the idea behind mobile devices is to make
the user more productive by added convenience.
Anything that slows down the devices would
ultimately detract from the users productivity.
Additionally, encrypted devices are harder to
diagnose, repair, and recover. However, these
concerns are more applicable to older less powerful devices (McGillicuddy, 2006).
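For readers who want a concrete picture, the sketch below encrypts a small set of records using the third-party Python cryptography package's Fernet recipe (symmetric, authenticated encryption). The record contents and the idea of storing the key in a separate key-management system are illustrative assumptions; full-disk encryption products work differently but rely on the same principle that the data is unreadable without the key.

from cryptography.fernet import Fernet

# The key must be stored away from the device (e.g., in a corporate
# key-management system); losing the key means losing the data.
key = Fernet.generate_key()
fernet = Fernet(key)

records = b"name,account\nA. Smith,12345678"   # stand-in for sensitive records
ciphertext = fernet.encrypt(records)            # what would be written to disk

# Whoever finds the laptop sees only ciphertext; with the key, the
# original records are recovered intact.
assert fernet.decrypt(ciphertext) == records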

DATA
Organizations accumulate a wide breadth of data that, if stolen, could potentially hurt the enterprise. Loss or theft of confidential information, such
as blueprints and engineering plans, tenders,
budgets, client lists, e-mails and pricelists, credit
card and other financial information, medical or
other confidential personally identifiable records,
classified, restricted or personal information,
scripts, storyboards, source code, database
schemas, or proprietary trade secrets can severely
impact the integrity and profitability of a corporation. This risk is amplified by the prevalence of
portable computing devices as a part of normal
business activities and by the increasing levels of
online transactions that occur routinely (GFI-2,
2007).
Fundamentally, there are two types of security.
The first type is concerned with the integrity of the
data. In this case the modification of the records
is strictly controlled. The second type of security
is the protection of the information content from
inappropriate visibility. Names, addresses, phone
numbers, and credit card details are good examples

of this type of data. Unlike the protection from


updates, this type of security requires that access
to the information content is controlled in every
environment.
The Internet makes it easy for organizations
to collect personal identifying information, such
as: names, addresses, social security numbers,
credit card numbers, or other identifiers (Shimeall,
2001). If this information were disclosed inappropriately, it would put these individuals at risk
for identity theft (Wang, Lee, & Wang, 1998). To
guard against such an outcome, laws worldwide
have been passed to aid in data protection.

The Threat from Within


Within the U.S., the Gartner Group estimates that
70% of all unauthorized access to information
systems is committed by employees. The CSI/FBI
survey found that 68% of respondents claimed
losses due to security breaches originating from
insiders (Gordon, Loeb, Lucyshyn, & Richardson, 2006). Of course, the magnitude of insider
malfeasances depends somewhat on how one
slices and dices the numbers. The U.K. Scotland
Yard Computer Crime Research Center (2005)
found that 98% of all crimes committed against
companies in the U.K. had an insider connection.
In the USA, surveys conducted by the U.S. Secret
Service and CERT coordination center concluded
that: Respondents identified current or former
employees and contractors as the second greatest
cyber security threat, preceded only by hackers
(Keeney, Kowalski, Cappelli, Moore, Shimeall,
& Rogers, 2005).

ENDPOINT (PERIMETER-BASED) SECURITY
The term endpoint, as its name implies, is any
place that a device can interact with another device.
Generally speaking, an endpoint is an individual
computer system or device that acts as a network client and serves as a workstation or personal computing device. Endpoints are often mobile and
intermittently connected and in the mobile society,
they are becoming indistinguishable (Forescout,
2007; Endpointsecurity, 2004).
Laptops have become so popular they have
almost caught up with desktop machines, as office use goes (40% to 45%; CSI/FBI survey,
2006). Because laptops are not tethered to the
desk, they are routinely out of the protection of the
organization's network. Additionally, if removable
media (floppies, CDs, DVDs, flash drives) are
used on laptops or office machines, they are an
easy entry point for malware. A further security
concern is the construct of engineering devices
for easy maintenance. These easy maintenance
devices can allow a person to literally remove the
internal hard drive from a laptop in less than a
minute and make off with all of the private data
that is in the machine.
Endpoint security is the total of the measures taken to implement security when sending and receiving data. These measures include assessing the risk to the clients' antivirus and personal firewalls, as well as protecting the network from the endpoints themselves. Endpoint
security logically extends to the management and
administration of these security measures. It also
deals with risk, reporting, and knowledge management of the state and results of these measures
(Positive Networks, Endpoint Security).

Endpoint Components
Firewalls
In general terms, a firewall is software or a
hardware device that controls the flow of traffic
between two networks or entities. A packet filter
firewall works by inspecting the contents of each
network packet header and determining whether
it is allowed to traverse the network. There are
basically three types of firewalls: packet filter,
stateful inspection, and application proxy.
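As a toy illustration of the packet-filter idea only (the rule set and packet fields below are invented, and real firewalls operate on raw packet headers rather than Python dictionaries), a filter simply compares each packet's header fields against an ordered rule list and applies the first match:

# Toy packet-filter sketch: compare header fields against an ordered rule
# list and apply the first match. Rules and fields are invented examples.
RULES = [
    {"action": "allow", "protocol": "tcp", "dst_port": 443},   # HTTPS in
    {"action": "allow", "protocol": "tcp", "dst_port": 25},    # mail in
    {"action": "deny",  "protocol": "any", "dst_port": None},  # default deny
]

def filter_packet(packet: dict) -> str:
    """Return 'allow' or 'deny' for a packet based on its header fields only."""
    for rule in RULES:
        proto_ok = rule["protocol"] in ("any", packet["protocol"])
        port_ok = rule["dst_port"] in (None, packet["dst_port"])
        if proto_ok and port_ok:
            return rule["action"]
    return "deny"

print(filter_packet({"protocol": "tcp", "dst_port": 443}))   # allow
print(filter_packet({"protocol": "udp", "dst_port": 53}))    # deny (no matching allow rule)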



In the case of a personal firewall, it controls the network traffic between a computer on one
side, and the Internet or corporate network on
the other side. A firewall is a network (hardware
& software) node that isolates a private network
from a public network. The firewalls job is to
keep unwelcome traffic from the Internet out of
the computer, and also to keep in the traffic that
you do not want leaving the computer. To that
end, organizations may have several firewalls to
create barriers around different layers of their
infrastructure. Firewalls are often compared to a
bouncer at a nightclub: they are located at the
point of entry; they enforce rules to determine
who gets in (and out); and they inspect all that
passes through the doors they are guarding. With
a layered approach, it is possible that a firewall can ensure that even if a password is compromised,
an intruder will only have restricted access to
the network.
However, firewalls are neither the first nor the
last word in endpoint components. Hardware and
software firewalls have a serious flaw in that they
typically do not look at the contents of a packet;
they only look at its headers. As written earlier,
antivirus software is not very effective against
spyware; the same is true of a firewall.

PREVENTIVE MEASURES
The open nature of PCs in most organizations
has resulted in users installing a wide variety
of applications that they use to get through their
day, and several that they should not. Some IT
managers attempt to prohibit the use of unauthorized peripherals (removable media) and
applications with the hope that this process will
shut out malware. The usage of portable devices
at work could impact corporate network security
through the intentional or unintentional introduction of viruses, malware, or crimeware that can
bring down the corporate network and/or disrupt
business activity.

Even with the tightest security net, it is possible for a destructive breach to occur. Failure to implement a security audit process to meet government
regulatory requirements can result in significant
fines, in addition to the possibility of imprisonment. The risks are real and affecting businesses
on a daily basis (Juniper Research, 2006).
Further, not only are devices a threat to data
and machine integrity, but also to worker productivity. An employee can use company hardware
and software to enhance digital photos, play
computer games, or work on freelance projects.
The control of USB (universal serial bus) ports
can limit unauthorized use and prevent intentional
or accidental attacks against a companys network
(Muscat, 2007). Control of the USB ports can be
made either programmatically or by physically
locking & blocking them (PC Guardian).
Additionally, there are emerging technologies
that monitor the movement of data and enforce
actions on the data based on company policies.
These products from vendors such as Orchestria
and Vericept work at the network and desktop
levels, and can monitor movement, as well as
prevent data from being copied from the originating application to external sources, such as
USB drives.
Another approach relies on detecting the departure of an authorized user. A wireless USB
PC Lock will lock and unlock a PC based on a
users proximity to the machine. A small transmitter is carried by the user, if s/he is more than
two meters away, the machines screen will be
programmatically locked, once the user returns
in range the screen is unlocked.


The End User

While the chapter is aimed at management, we would be amiss if we did not describe some things that the end user can do. This list is far from complete, and some may argue about the order in which items are presented. They might also point out that important suggestions have been omitted. The caveat is that this list is not for corporate users; it is for the home user. For the home user, the advice is simple:

1. Get a good antivirus package and keep it up to date.
2. Let your system download system updates (patches) from a trusted site.
3. Deactivate ActiveX components.
4. Do not install items from unknown sources.
5. Do not open e-mails from people or organizations that you do not know.
6. Never click on an embedded e-mail link; copy it or use a bookmark.
7. Be extremely careful about what sites you visit.
8. Strangers that send you mail want something!
9. You will not win something if you did not enter.

In an organizational environment, the above still applies. However, the user is usually burdened by user names and passwords. The number one suggestion is to pick a strong password and not share it with anyone for any reason. If you need to have multiple sign-ons, tailor the passwords for each application; for example, your password for accounts payable may begin with AP. The easiest way to pick strong passwords is to create an acronym out of your favorite song lyrics: take the first letter of each of the first 12 words, add your application code, and add some important number, like the middle digits of your first home address.
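As a purely illustrative sketch of that acronym idea (the lyric, application code, and number below are invented examples, not recommended values; choose your own and never reuse the result across unrelated systems):

# Illustrative sketch of the acronym approach described above.
def acronym_password(lyric: str, app_code: str, important_number: str) -> str:
    words = lyric.split()[:12]                      # first 12 words of the lyric
    acronym = "".join(w[0] for w in words)          # first letter of each word
    return f"{app_code}{acronym}{important_number}" # prefix with the application code

lyric = "somewhere over the rainbow way up high there is a land that"
print(acronym_password(lyric, app_code="AP", important_number="47"))
# -> APsotrwuhtialt47  (accounts-payable password in this made-up example)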

The Human in the Equation

According to CompTIA's IT security survey, human error, either alone or in combination with a
technical malfunction, was blamed for 74% of
the IT security breaches (Cochetti, 2007). Human
involvement in systems is not limited to making errors; during the day users often take breaks to


surf the Web, e-mail, or IM their friends.
However, Web surfing can do more than
relieve stress and waste time; it can expose users and organizations to dangerous Web sites,
data leakage, and e-mails with inappropriate or
dangerous content. Further, it can lead to installation of non-authorized software, which besides
prompting civil and criminal investigations, can
introduce privacy-robbing malware. This type of
publicity has a negative impact on the bottom line.
To protect themselves, organizations should abide
by a strong user access policy (Shinder, 2007).
Instant messaging (IM) has begun to be embraced by organizations because it provides a cost-effective means to electronically communicate
both synchronously and nearly instantaneously.
IM presence awareness and permission-based lists
give the perception of a low risk of receiving spam
or other unwanted messages. The rapid adoption
of public IM services (such as AOL, MSN, and
Yahoo) has raised serious concerns about security
risks and compliance with regulatory requirements. IM and e-mail can be used as a tool for
deliberately disseminating private information;
or it may provide a channel that could inadvertently admit spyware, worms, or viruses. Since
instant messaging involves free electronic communication with internal employees and anyone
designated as a trusted colleague, unauthorized
information dissemination may proliferate via unmonitored channels (Webex, 2006).
Roger J. Cochetti, group director of CompTIA U.S. Public Policy, states that security assurance continues to depend on human actions and knowledge as much, if not more so, than it does on technological advances. He indicates that failure
to follow security procedures (human error) was
blamed by more than 55% of the organizations
as the factor that contributed the most to security
breaches (Cochetti, 2007).


LISTING: WHITE, BLACK, AND GRAY
Listing is a response to malware's continuous mutation of its signatures, which results in a
continuous flow of zero-day attacks. The basic
idea is to restrict execution of programs based
on a list. Listing comes in three distinct styles:
white, black, and gray.
White listing basically consists of allowing users/workstations to run only software that has been
pre-approved by the organization. Implementing
this approach requires conducting an exhaustive inventory of all applications in use as well as their
version. Once the inventory is completed, each application must be reviewed to ensure it is required.
After the review, the software implementations
and versions need to be made consistent across
the protected network segments.
Black listing is the opposite of white
listing. Workstations are prevented from running applications or visiting Web sites that are
specifically listed. Thus, sites that are found to be
perpetrators of malware and spam are banned
from user activity. While this may seem to be
a viable approach for the business managers, it
is weak, and can be very risky, if not supported
by additional controls. A missed module can
be disastrous. Further, new malicious or highly
vulnerable applications are created or identified
faster than they can be placed on a blacklist.
Gray listing is a conditional blacklist, and
has a high risk of false positives, blacklisting
someone by mistake.
Hybrid listing combines the features of white, black, and gray
listing. It is designed so that management can approve some software and ban other software that
is not needed or wanted, thus preventing the first
execution of any new unknown software. Because
the hybrid approach prevents the first execution,
not the installation, the approval/authorization
process can be centrally managed in real time.
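In practice, a white-list (or hybrid) check reduces to a lookup before a program is allowed to run. The sketch below is schematic only; the hash value and program bytes are invented placeholders, and real products intercept execution at the operating-system level rather than in a Python script.

# Schematic white-list check: allow a program to run only if its hash is on
# the approved list; anything unknown is blocked by default.
import hashlib

# Populated from the organization's software inventory; this entry is a
# placeholder, not a real hash.
APPROVED_SHA256 = {"0" * 64}

def is_approved(program_bytes: bytes) -> bool:
    digest = hashlib.sha256(program_bytes).hexdigest()
    return digest in APPROVED_SHA256   # default deny for unknown software

if not is_approved(b"binary contents of an unknown tool"):
    print("Blocked: not on the white list; request central approval.")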

Browser-based listing relies on a modern browser to check that the site a user is going
to is not a forgery. One option downloads a list
of known Web forgeries (see Figure 1, ploy to
capture personal information): but this technique
only offers real protection for a few moments after
it is downloaded. Another technique would be to
have the browser check with an authority (such as
Google) each time a URL or site is entered.
Mozilla indicates that users can protect themselves from Web forgeries by:

Instead of following links from an e-mail to banks or online commerce sites, always either type the Web page address in manually or rely on a bookmark;
They also recommend using a Password Manager to remember passwords instead of entering them manually; and
They recommend using an e-mail product that will detect and alert the user about suspect Web sites.

Least Privilege Authority


In an organizational environment, the information
systems/information technology group struggles
to give users the access they need and want, while attempting to ensure that security is not sacrificed.


Programs that perform useful functions of work
are known as applications. Applications need
certain capabilities to create, read, update, and
delete data; these privileges often go by the acronym CRUD. Applications need access to certain
services that are only granted through the operating system or the system administrators, such as scheduling new tasks, sending information
across applications, and verifying passwords. In
order for that to work the application/user needs
to be at a high enough level of trust (permissions/privileges/authority) so that they know what
they are doing.
With the principle of least privilege, the goal is
to give users only the minimal access and privileges they need to complete the task at hand. In
most cases this will be for the entire logon session,
from the time they logon in the morning till they
leave for the night. The concept of principle of
least privilege is a prophylactic, a kind of safety belt; if the machine is not attacked by malware, it is not necessary and does no harm; but if it is, it's an extra layer of protection. Therefore, the
construct of least privilege is becoming a common phrase as organizations scramble to protect
network assets and resources.
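In code, least privilege often reduces to checking a user's granted CRUD rights per resource and granting nothing by default. The roles, resources, and grants below are invented for illustration; they are not from the chapter or any specific product.

# Minimal sketch of least-privilege CRUD checks: users get only the rights
# explicitly granted for a resource; everything else is denied by default.
GRANTS = {
    ("payroll_clerk", "payroll"):    {"create", "read", "update"},
    ("employee",      "own_record"): {"read"},
}

def is_allowed(role: str, resource: str, action: str) -> bool:
    return action in GRANTS.get((role, resource), set())   # default deny

print(is_allowed("employee", "own_record", "read"))      # True
print(is_allowed("employee", "payroll", "read"))         # False: not granted
print(is_allowed("payroll_clerk", "payroll", "delete"))  # False: delete never granted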

Vulnerability Management
The process of patch management can be complex, difficult, and is often sacrificed when an
organization is in a crisis mode. If shortcuts
are taken, they will almost always come back to haunt the organization. Patching in programming has long been defined as trading an
error that is known for one that is unknown. It
is not the thing to rush through. Vendors spend
considerable time researching vulnerabilities and
devising repairs or work-arounds. Many of the
repairs are dependent on updates being already
applied. Failure to stay current on updates is one
of the main reasons that enterprises struggle with
bot infections (Symantec).
Patching is a trade-off between the time required to repair a problem responsibly and completely versus the hacker's window of opportunity
to exploit a specific vulnerability. Vulnerability
management has become a critical aspect in
managing application security. Patching vulnerabilities (depending on the severity) can be a time
consuming job. To do it safely, the patches should
be applied and tested in an isolated environment
against a copy of the system.


New components of distributed architectures: Standardization and plug-and-play are not always positive; they come with a price. Standardized code makes it easier for all involved, the developer and the criminal hacker. Each module represents a unique addressable attack point, a target at which criminal hackers can aim their exploits.
Multiplying network access points can act similar to an open wound; if one is not careful, it will allow in all sorts of viruses and the like. With organizations opening their networks to suppliers, clients, customers, employees, and contractors, security has become a mandate. Multiple entry points have raised the importance of controlling the traffic that comes and goes through the network. In this regard, firewalls and antivirus products are important parts of an effective security program.
Wireless network access points bring their own set of issues. With wireless, the perimeter (endpoint) security is critical. It is important to have an IDS (intrusion detection system) and to monitor all traffic.
Simply relying upon firewalls and antivirus software is not an effective strategy. Understanding the network and understanding its weaknesses (vulnerabilities) can provide insight on how to manage and protect critical data.

CONCLUSION
No matter how hardened a network perimeter is,
there are a number of weaknesses that can allow
breaches to occur. It is usually recommended that a layered defense approach be adopted to strengthen
protection. However, care needs to be taken that
additional layers actually add protection instead
of just protecting against the exact same vulnerabilities or threats. Reckless implementation or
selection of software may not produce the desired
outcome. A layered approach may be more like
buying overlapping warranty coverage. The harm
is that businesses may confuse this approach for
real security. Ultimately, they could end up spending more money and resources on implementing
the wrong security mechanisms without gaining
complete security (Ou, 2007).
Remember, the organization is responsible for maintaining the privacy of its stakeholders and consumers while also preserving a harassment-free, discrimination-free, crime-free, and civil business
environment. The development, implementation,
and enforcement of a comprehensive Internet
policy can help in that goal. Whether employees
intentionally violate Internet policy or accidentally surf to an objectionable Web site, under the
legal principle known as vicarious liability, the employer can be held responsible for the misconduct of the organization's employees, even
if the employer is completely unaware that there
is a problem.
Simply following security best practice by
limiting access rights may be a good first step,
but it is just a step. No single approach is going to
be totally viable against all malware and protect
privacy. The best protection comes from using a layered approach. In addition to using technology,
it is important to:

Create and enforce policies and procedures
Educate and train
Monitor the network and the systems
Require penetration testing
Ban inappropriate sites and prohibit wasted resources and productivity

Aberdeen Group's (2005) research shows that technology, by itself, is not the primary indicator for success; this was true despite differences in technology usage, loss rates, or firm sizes. They also found that organizations performing as best-in-class leaders focus on managing four areas to maximize results for the money being spent on security:

1. Sharing of data and knowledge to improve results
2. Processes in place for executing against objectives
3. Organizational structure and strategy to manage to results
4. A security technology maturity that influences results

Of the four, they indicate that the most important focus area is the managing of data and
knowledge to improve results.
This chapter presented an overview of the
concerns that organizations must address while
working within the Internet community. It was
meant to inform management of the potential

threats and pitfalls that must be addressed to be


a viable player within the Internet realm. While
there are many technical areas that need to be attended to, nothing is more important than maintaining the user's confidentiality, integrity,
and authenticity (CIA). Hackers and con-artists
are devising clever and inventive techniques to
violate a user's privacy for the purpose of committing illegal activities. If left unchecked, these issues threaten the viability of e-commerce and
e-business.

FUTURE RESEARCH DIRECTIONS


This chapter lays out some of the issues that must be concentrated on, with the most emphasis being placed upon a strong organizational Internet privacy and security policy, followed by education
and training of users and stakeholders.
Future research should focus on how large and
small organizations create, maintain, and monitor
privacy and security policies. Because organizations are of differing sizes and have different
resources available, research should investigate
how large and small organizations vary on their
approaches and implementation. Future research
should also focus on how existing protections can
be expanded to protect tomorrows technology.
Finally, research needs to be conducted on how to protect portable storage devices from misuse,
as this type of media is bound to proliferate.

REFERENCES
Aberdeen Group. (2005). Third Brigade business value research series: Most important security
action: Limiting access to corporate and customer
data. Whitepaper. Retrieved October 2007, from
http://www.thirdbrigade.com/uploadedFiles/
Company/Resources/Aberdeen%20White%20P
aper%20--%20Limiting%20Access%20to%20
Data.pdf



Air Defense Press Release. (2005, February


17). AirDefense monitors wireless airwaves at
RSA 2005 conference. Retrieved October 2007,
from http://airdefense.net/newsandpress/02_07_
05.shtm
American Management Association. (2005).
Electronic monitoring & surveillance survey.
Retrieved October 2007, from http://www.amanet.
org/research/pdfs/EMS_summary05.pdf
Basili, J., Sahir, A., Baroudi, C., & Bartolini,
A. (2007, January). The real cost of enterprise
wireless mobility (Abridged ed.). The Aberdeen
Group. Retrieved October 2007, from http://www.
aberdeen.com/summary/report/benchmark/Mobility_Management_JB_3822.asp
Baylor, K. (2006, October 26). Killing botnets. McAfee. Retrieved March 2007, from http://blogs.
techrepublic.com.com/networking/?cat=2
Bernard, A. (2006). McAfee's top ten security
threats for 2007. Retrieved October, from http://
www.cioupdate.com/print.php/3646826
Bumgarner, J., & Borg, S. (2007). The US-CCU
cyber security check list. Retrieved November
2007, from http://www.usccu.us/documents/USCCU%20Cyber-Security%20Check%20List%
202007.pdf
Cafarchio, P. (2004). The challenge of non-viral malware! TISC Insight Newsletter, 4(12).
Retrieved October 2007, from www.pestpatrol.
com/Whitepapers/NonViralMalware0902.asp
Cannon, D. M., & Kessler, L. (2007). Danger: Corporate data breach! Journal of Corporate
Accounting & Finance, 18(5), 41-49.
CERT. (2007). Vulnerability remediation statistics. Retrieved November 2007, from http://www.cert.org/stats/vulnerability_remediation.html
Clearswift. (2006, October). Simplifying content security: Ensuring best-practice e-mail and



web use. The need for advanced, certified email


protection. Retrieved October 2007, from http://
whitepapers.zdnet.com/whitepaper.aspx?&scid=
280&docid=271750
CNET Staff. (2004, September). Spam volume
keeps rising. Retrieved September 2007, from
http://news.com.com/2114-1032-5339257.html
Cochetti, R. J. (2007, June). Testimony of the computing technology industry association (CompTIA), before the house small business committee
subcommittee on finance and tax, data security:
Small business perspectives. Retrieved October
2007, from www.house.gov/SMBiz/hearings/
hearing-06-06-07-sub-data/testimony-06-0607-compTIA.pdf
Computing Technology Industry Association.
(2004). Annual study. Retrieved October 2007,
from http://www.joiningdots.net/library/Research/statistics.html
Cox, J. (2007, February 9). RSA: Attendees drop ball on Wi-Fi security; many IT security experts
at conference used unsecured devices. Network
World. Retrieved October 2007, from http://www.
networkworld.com/news/2007/020907-rsa-wifisecurity.html
Endpointsecurity. (2004). What is endpoint security? Retrieved October 2007, from http://www.
endpointsecurity.org/Documents/What_is_endpointsecurity.pdf
Fielding, J. (2007, January 28). 25% of all computers on Botnets. Retrieved http://blogs.techrepublic.
com.com/networking/?cat=2
Flynn, N. (2005). E-policy best practices a business guide to compliant & secure internet, instant
messaging (IM), peer-to-peer (P2P) and email
communications. The ePolicy Institute; Executive
Director, St. Bernard Software. Retrieved http://
www.securitytechnet.com/resource/security/application/iPrism_ePolicy_Handbook.pdf

Forescout. (2007). NAC enforcement and the role


of the client. Infonetics Research, Inc. Retrieved
July 2007, from www.Forescout.com/downloads/
whitepapers/Infonetics-NAC-Enforcement-andthe-Role-of-the-Client.pdf
GFI. (2007). The threats posed by portable storage
devices. Whitepaper. Retrieved July 2007, from
http://www.gfi.com/whitepapers/threat-posedby-portable-storage-devices.pdf
Glaessner, T. C., Kellermann, T., & McNevin, V. (2004). Electronic safety and soundness: Securing finance in a new age (World Bank Working Paper No. 26). Washington, DC. Retrieved from http://siteresources.worldbank.org/DEC/Resources/abstracts_current_studies_2004.pdf
Gordon, L. A., Loeb M. P., Lucyshyn, W., &
Richardson, R. (2006). CSI/FBI computer crime
and security survey. Computer Security Institute.
Retrieved November 2007, from http://www.cse.
msu.edu/~cse429/readings06/FBI2006.pdf
Harley, D., Slade, R., & Gattiker, U. (2001). Viruses
revealed: Understanding and counter malicious
software. New York: McGraw-Hill/Osborne.
Higgins, K. (2007, November 9). The world's
biggest botnets. Retrieved November 2007,
from http://www.darkreading.com/document.
asp?doc_id=138610
Im, G. P., & Baskerville, R. L. (2005, Fall). A
longitudinal study of information system threat
categories: The enduring problem of human error.
ACM The DATA BASE for Advances in Information Systems, 36(4), 68-79.
Juniper Research. (2006, February). Security
information & event management. Retrieved
http://www.juniper.net/solutions/literature/solutionbriefs/351178.pdf
Keeney, M., Kowalski, E., Cappelli, D., Moore, A., Shimeall, T., & Rogers, S. (2005). Insider threat study: Computer system sabotage in critical infrastructure sectors. U.S. Secret Service and CERT Coordination Center/SEI. Retrieved November 2007, from http://www.cert.org/archive/pdf/insidercross051105.pdf
Kirk, J. (2007, May 17). Estonia recovers from
massive denial-of-service attack. InfoWorld, IDG
News Service. Retrieved November 2007, from
http://www.infoworld.com/article/07/05/17/estonia-denial-of-service-attack_1.html
McAfee, J., & Haynes, C. (1989). Computer
viruses, worms, data diddlers, killer programs,
and other threats to your system. New York: St.
Martin's Press.
McGillicuddy, S. (2006, November 1). Encrypting mobile devices: A best practice no one uses. SearchSMB.com. Retrieved from http://searchSMB.techtarget.com/originalContent/0,289142,sid44_gci1227295,00.html?asrc=SS_CLA_300336&psrc=CLT_44
Muscat, A. (2007, January 17). Perils of portable
storage. Computer Reseller News. Retrieved http://
www.gfi.com/documents/32686_crn_eprint.pdf
Norman, D. (1983). Design rules based on analysis
of human error. Communications of the ACM,
26(4), 254-258.
Osterman Research Inc. (2003). The impact of
regulations on email archiving requirements.
ORI white paper sponsored by Information
Management Research. Retrieved October 2007,
from http://www.Ostermanresearch.com/whitepapers/or_imr01.pdf
Ou, G. (2007) Wireless LAN security myths that
will not die. ZDNet. Retrieved July 2007, from
http://blogs.zdnet.com/Ou/?p=454
Padilla, R. (2007). Root out data breach dangers by
first implementing common sense. TechRepublic.
Retrieved July 2007, from http://blogs.techrepublic.com.com/tech-manager/?p=312
Pai, A. K., & Basu, S. (2007). Offshore technology outsourcing: Overview of management and legal issues. Business Process Management Journal, 13(1), 21-46.

Privacy Rights Clearinghouse. (2007). A chronology of data breaches. Retrieved October 2007, from http://www.privacyrights.org/ar/ChronDataBreaches.htm

Provos, N., McNamee, D., Mavrommatis, P., Wang, K., & Modadugu, N. (2007). The ghost in the browser: Analysis of web-based malware. Google, Inc. Retrieved from http://www.usenix.org/events/hotbots07/tech/full_papers/Provos/Provos.pdf

Qualys. (2006). The laws of vulnerabilities: Six axioms for understanding risk. Retrieved October 2007, from http://developertutorials-whitepapers.tradepub.com/free/w_qa02/pf/w_qa02.pdf

Savage, M. (2007, October 23). Proposed legislation would strengthen cybercrime laws. Retrieved November 2007, from http://searchsecurity.techtarget.com/originalContent/0,289142,sid14_gci1278341,00.html?track=sy160

Schuman, E. (2007, November 14). TJMaxx's projected breach costs increase to $216M. eWEEK. Retrieved November 2007, from http://fe42.news.sp1.yahoo.com/s/zd/20071114/tc_zd/219495

Shaw, K., & Rushing, R. (2007). Podcast: Keith Shaw (NetWorkWorld) talks with Richard Rushing, chief security officer, at ... data, listen to this podcast. Retrieved October 2007, from http://www.networkingsmallbusiness.com/podcasts/panorama/2007/022807pan-airdefense.html?zb&rc=wireless_sec

Shimeall, T. (2001, August 23). Internet fraud. Testimony of Timothy J. Shimeall, Ph.D., CERT Analysis Center, Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA, before the Pennsylvania House Committee on Commerce and Economic Development, Subcommittee on Economic Development. Retrieved October 2007, from http://www.cert.org/congressional_testimony/Shimeall_testimony_Aug23.html

Shinder, D. (2007, February 9). How SMBs can enforce user access policies. Retrieved April 2007, from http://articles.techrepublic.com.com/51001009_11-6157054.html?tag=nl.e101

Symantec. (2006, September 19). Symantec finds firms recognize importance of application security, yet lack commitment in development process. News release. Retrieved from http://www.symantec.com/about/news/release/article.jsp?prid=20060919_01

Vivisimo. (2006). Restricted access: Is your enterprise search solution revealing too much? Retrieved October 2007, from http://Vivisimo.com/ or http://www.webbuyersguide.com/bguide/whitepaper/wpDetails.asp_Q_wpId_E_NzYyMQ
Wang, H., Lee, M., & Wang, C. (1998, March).
Consumer privacy concerns about internet marketing. CACM 41(3), 63-70.
Webex. (2006). On-demand vs. On-premise instant messaging. Webex Communications, Ease of Communications - On Demand EIM Solutions. Retrieved October 2007, from http://www.webbuyersguide.com/bguide/Whitepaper/WpDetails.asp?wpId=Nzc4MQ&hidrestypeid=1&category=
Wilson, T. (2007, November 12). ID thief admits
using botnets to steal data. Retrieved November
2007, from http://www.darkreading.com/document.asp?doc_id=138856
Yank, G. C. (2004, December 21). Canning spam: Consumer protection or a lid on free speech? Retrieved October 2007, from http://www.law.duke.edu/journals/dltr/articles/2004dltr0016.html

Additional Reading

Bächer, P., Holz, T., Kötter, M., & Wicherski, G. (2005). Know your enemy: Tracking botnets; using honeynets to learn more about bots. The Honeynet Project & Research Alliance. Retrieved October 2007, from http://www.honeynet.org/papers/bots/

Henry, P. A. (2007, June). Did you GET the memo? Getting you from Web 1.0 to Web 2.0 security. (In Today's Risky Web 2.0 World, Are You Protected?). Secure Computing Corporation.

Cohen, F. (1984). Experiments with computer viruses. Fred Cohen & Associates. Retrieved October 2007, from http://all.net/books/virus/part5.html

King, S. T., Chen, P. M., Wang, Y., Verbowski, C., Wang, H., & Lorch, J. R. (2006). SubVirt: Implementing malware with virtual machines. Retrieved October 2007, from http://www.eecs.umich.edu/virtual/papers/king06.pdf

Commission of the European Communities. (2000). Proposal for a directive of the European Parliament and of the Council concerning the processing of personal data and the protection of privacy in the electronic communications sector. Retrieved October 2007, from http://europa.eu.int/information_society/topics/telecoms/regulatory/new_rf/documents/com2000-385en.pdf
Computer Crime Research Center. (2005). Security issues: find the enemy within. Retrieved
October 2007, from http://www.crime-research.
org/analytics/security-insider/
Endicott-Popovsky, B., & Frincke, D. (2006). Adding the fourth R: A systems approach to solving the hacker's arms race. In Proceedings of the 2006 Symposium, 39th Hawaii International Conference on System Sciences. Retrieved October 2007, from http://www.itl.nist.gov/iaui/vvrg/hicss39/4_r_s_rev_3_HICSS_2006.doc
European Parliament and the Council of the
European Union. (2003). Annex 11 computerised
systems, Labcompliance. Retrieved October 2007,
from http://www.labcompliance.com/documents/
europe/h-213-eu-gmp-annex11.pdf
Federal Trade Commission. (1999). Gramm-Leach-Bliley Act. Retrieved October 2007, from http://
www.ftc.gov/privacy/privacyinitiatives/glbact.
html
Federal Trade Commission. (2006). ChoicePoint
settles data security breach charges; to pay $10
million in civil penalties, $5 million for consumer
redress. Retrieved October 2007, from http://
www.ftc.gov/opa/2006/01/choicepoint.htm

MessageLabs. (2007). Effectively securing small businesses from online threats. Retrieved October 2007, from http://www.messagelabs.com/white_papers/secure_smb
SANS Institute. (1999, May). Management errors.
In Proceedings of the Federal Computer Security
Conferences held in Baltimore. Retrieved October
2007, from http://www.sans.org/resources/errors.
php
Sarbanes-Oxley. (2002). Sarbanes-Oxley act of
2002. Retrieved October 2005, from http://www.
sarbanes-oxley.com/section.php?level=1&pub_
id=Sarbanes-Oxley
Shinder, D. (2002). Scene of the cybercrime
(Computer Forensics Handbook). Rockland, MA:
Syngress Publishing.
United Kingdom Parliament. (2000). Freedom
of information act 2000. Retrieved October
2007, from http://www.opsi.gov.uk/ACTS/
acts2000/20000036.htm
U.S.A. Department of Health & Human Services.
(1996). Health insurance portability and accountability act of 1996. Retrieved October 2007, from
http://aspe.hhs.gov/admnsimp/pl104191.htm
U.S.A. Federal Trade Commission. (2002). How
to comply with the privacy of consumer financial
information rule of the Gramm-Leach-Bliley
act. Retrieved July 2002, from http://www.ftc.
gov/bcp/conline/pubs/buspubs/glblong.shtm



Section II

Frameworks and Models



Chapter IV

Practical Privacy Assessments


Thejs Willem Jansen
Technical University of Denmark, Denmark
Søren Peen
Technical University of Denmark, Denmark
Christian Damsgaard Jensen
Technical University of Denmark, Denmark

Abstract
Governments and large companies are increasingly relying on information technology to provide enhanced services to the citizens and customers and reduce their operational costs. This means that an
increasing amount of information about ordinary citizens is collected in a growing number of databases.
As the amount of collected information grows and the ability to correlate information from many different databases increases, the risk that some or all of this information is disclosed to unauthorised
third parties grows as well. Although most people appear unaware or unconcerned about this risk, both
governments and large companies have started to worry about the dangers of privacy violations on a
major scale. In this chapter, we present a new method of assessing the privacy protection offered by a
specific IT system. The operational privacy assessment model, presented here, is based on an evaluation
of all the organisational, operational and technical factors that are relevant to the protection of personal
data stored and managed in an IT system. The different factors are measured on a simple scale and the
results presented in a simple graphical form, which makes it easy to compare two systems to each other
or to identify the factors that benefit most from improved privacy enhancing technologies. A standardised assessment of the privacy protection offered by a particular IT system will serve to help system owners understand the privacy risks in their IT system, as well as help individuals, whose data is being processed,
to understand their personal privacy situation. This will facilitate the development and procurement of
IT systems with acceptable privacy levels, but the simple standard assessment result may also provide
the basis for a certification scheme, which may help raise the confidence in the IT system's ability to
protect the privacy of the data stored and processed in the system.

Introduction
Existing research into privacy enhancing technology (PET) has provided few answers to many of
the real questions that governments and large
companies are facing when they try to protect the
privacy of their citizens or customers. Most of the
current work has focused on technical solutions
to anonymous communications and pseudonymous interactions, but, in reality, the majority of
privacy violations involve careless management
of government IT systems, inadequate procedures
or insecure data storage. In this chapter, we introduce a method that helps system developers and
managers to assess the level of privacy protection offered by their system and to identify areas
where privacy should be improved. The method
has been developed in the context of government
IT systems in Europe, which has relatively strict
privacy legislation, but we believe that the method
may also apply to other government systems, nongovernmental organisations (NGOs) and large
private companies. With the privatisation of many
state monopolies, such as telecommunications
and railroads, in many countries and the increasing number of public/private partnerships, the
distinction between the public and private sector
has grown increasingly fuzzy.1 For the purpose
of clarity in our discussions, however, we have
decided to use the vocabulary from government
systems, so we discuss the relationships between
governments and citizens instead of companies
and customers.
Governments are increasingly relying on information technology to provide enhanced services
to the citizens and reduce the costs of the public
sector. This means that an increasing amount of
information about ordinary citizens is collected in
an increasing number of government databases.
As the amount of collected information grows and
the ability to correlate information from many
different databases increases, the risk that some
or all of this information is disclosed to unauthorised third parties grows as well. Although most citizens appear unaware or unconcerned about
this risk, governments have started to worry
about the dangers of privacy violations on a major
scale. If the government is not seen to be able
to treat information about its citizens securely,
these citizens will be reluctant to provide timely
and accurate information to the government in
the future. Many of the same factors are relevant
in the relationship between companies and their
customers, so both governments and large companies have realised that protecting the privacy
of their citizens and customers is necessary if
they are to reap the benefits of the information
society in the future.
The benefits of collecting and storing information about citizens in electronic databases are an
increasing level of efficiency in administrative
systems and convenience for the citizens, because
it provides government agencies with faster and
easier access to relevant data and improves their
ability to combine sets of personal data from different systems. This allows improved services at
reduced costs, for example, the Inland Revenue
Service in Denmark has reduced the number of
employees from around 14,000 to around 9,000
in the past decade, while the amount of information that is being processed about each citizen has
increased. The sheer volume of data collected by
different government IT systems, however, makes
it increasingly difficult for anyone to obtain an
accurate picture of all the personal information
that may be available in government databases.
Moreover, it also makes it difficult to determine
which persons, institutions or private companies
have access to the data.
There have been an increasing number of
incidents, where personal information has been
released to unauthorised third parties, either
through carelessness or through negligent administrative procedures. Financial News Online U.S. (2007) reports that JPMorgan Chase recently lost a container with a backup tape that includes the account information and social security numbers of some 47,000 of their Chicago-area clients. In another high-profile
press story, personal information including birth
dates and social security numbers of 26.5 million
U.S. veterans was recently compromised when a
laptop containing the information was stolen from
the house of an employee, who had taken home the
information on his laptop without authorisation,
according to Sandoval (2006). Similar examples
of employees who lose sensitive personal data of
clients and customers have become commonplace,
which led Smith (2006) to declare 2006 "The year of the stolen laptop" and the Privacy Rights
Clearinghouse (2007) has compiled a chronology of data breaches, which includes a total of
almost 155 million records containing
sensitive personal information involved in security breaches from January 2005 until May 2007.
These examples illustrate that privacy enhancing
technologies alone cannot protect the privacy of
citizens; effective enforcement of security and privacy policies and awareness among the users and managers of IT systems are also necessary.
The many press reports about privacy violations and the increasing risks associated with
identity theft mean that privacy is becoming a
major factor for both system architects and end-users. Such concerns lead to analysis and discussions
with varying results depending on the parties'
individual privacy concerns and the type of system involved. Privacy enhancing initiatives are
often focused on technical solutions to anonymous
communications and pseudonymous interactions
thus forgetting that privacy is not exclusively a
technical issue. The resulting ad-hoc approach to
privacy protection means that there are varying
degrees of privacy protection in government IT
systems. For the individual citizen, this means
that there is very little insight into the amount
of private data being handled in the government
sector as well as the how, why, and which officials
may be given access to this information. Asking
the citizens permission every time personal data
is collected ensures that the citizen knows what
data has been collected, but it does not ensure that

the citizen remembers what data has been collected and what information can be inferred from
the data. Informed consent, where the individual
citizen agrees to the collection of private data for
a specific purpose, is one of the most important
instruments in data protection legislation, but it is
not a realistic solution to this problem. Moreover,
informed consent only addresses the collection
and authorised use of private information; it does
little to inform the citizens about the way that
their data is stored and what procedures are in
place to keep this data safe.
In this chapter we present an operational privacy model that helps focus the privacy discussions on areas with the most significant privacy
risks. The model defines a method that can be used
to analyse the privacy properties of IT systems,
which are different in their design, functionality,
and the data that they process. The method helps
identify privacy risks in such systems, so that
they may be the subject of further analysis and
discussion. Moreover, the method helps assess the
magnitude of different privacy problems, so that
developers may decide which problem to address
first and citizens may decide whether they wish to
trust the system with their personal data. While
the method may be used to suggest areas of possible privacy enhancements, it does not seek to
provide specific solutions to privacy problems, but
leaves it to the owners of the system to develop
specific solutions for the individual system. In
other words, the goal of the model is to identify
possible privacy risks and provide the basis for
valuable privacy improvements. The assessment
of privacy plays an important role in online
systems because a standard privacy assessment
scheme may provide the basis for a certification
program that will allow individuals to decide
whether they wish to interact with a particular
service provider. The operational privacy assessment model proposed in this chapter is based on
factual answers to a few simple questions, which
makes it ideally suited for a certification scheme.
The results of a privacy assessment are presented
in a simple graphical form, which allows a layman to determine the overall privacy protection
in the examined system and identify areas with
particularly high privacy risks.

Privacy in Government IT Systems
There have been a number of proposals for a
definition of privacy, but although some have
been widely used, none have been able to meet
the demands of new technologies and variations
in cultures. This is why the concept today is
somewhat diffuse and the definition depends
highly on the context in which it is used.
Generally, privacy can be described as a form
of knowledge about our existence which we may
wish to control as it is something others potentially
can use against us. This knowledge can be further
divided into three categories: knowledge about
our person, knowledge about our relationships,
and knowledge about our behavior.
In this classification, "knowledge about our person" covers information regarding physical
factors, such as information about our health,
financial situation, consumer profile, current
location, and recent movements. For practical
reasons, it is common for individuals to share
chosen parts of information from this category
with selected persons or institutions. For example,
health information is often shared with doctors
and the health insurance agency, but this does not
automatically confer a wish to share this information with friends or co-workers.
The category "knowledge about our relationships" covers information about relationships
with other persons or institutions. Examples of
such relationships could be political and religious
convictions as well as family, sexual, work related, and other relationships with friends. Even
though most of these relationships involve at
least one other party, the knowledge about their
existence can still be sensitive and private for
the individual.
The final category, "knowledge about our behavior", covers information about us, which is
not necessarily based on something physical. This
can be information about our interests, habits,
priorities, life-plans, and general preferences.
While this information is not directly measurable, it can often be deduced from the actions of
the individual, such as shopping profiles, library
visits,2 and social circles.
These three categories cover most aspects that
define the identity of a person: the relation of this
person to others and the actions of the person in
this world, which allows a complete compilation
of information about an individual; information
that the person is likely to want to exercise some
control over.
Another reason why privacy is such a diffuse
and widely discussed term is that it covers a large
area. In order to work with the term without risking
that it becomes too vague, it has to be narrowed
down to what is relevant for the context of the
specific task.
The information which government systems require and use is usually information about different physical entities or attributes. This is different from the private sector, which also uses behavioral information to compile customer profiles and promote sales. The government systems
manage a lot of information which falls under the
category "knowledge about our person". Even
though the amount of information is extensive,
it is still only a subset of the total amount of data
covered by this category.
By limiting the definition of privacy for the
specific task of evaluating the privacy risks in a
government IT system, we only consider privacy
issues that are relevant in the context of administrative IT systems, so important privacy issues,
such as physical surveillance through CCTV, are
not addressed in this chapter.
The government sector uses the personal information about an individual for identification
and interaction purposes, for example, name,
date of birth, location of birth, current address,
and so forth. For this reason, these data are extremely sensitive. The consequences of a breach
in the security of these data can be negative for
both the individual and for the government institutions that are responsible for the data. Since
these data serve as the basis of identification of
an individual, a breach can, in the worst case, lead to
extensive problems with identity theft.
Personal data stored in government systems are commonly gathered for a specific
purpose. Whether this purpose is fair or is in
itself a breach of privacy is a completely different
discussion concerning the merits of the specific
system versus the limitations to the privacy of
the individual. This chapter does not enter this
discussion beyond looking at whether a system
has a given purpose for collecting private data or
not. The goal must be to limit access, use, and
processing of personal data as much as possible,
while still acknowledging the purpose for which
the data has been gathered.
With this focus on personal data in government
systems, we can finally make a task-specific definition of privacy. This definition is only applicable
for use in the model described in this chapter and
as such is not an attempt to make a complete and
general definition of privacy.
Privacy is the influence on and knowledge about
the existence and use of personal data in government digital records.
The requirement of knowledge about the
information that is stored about the individual,
gives the individual an important tool to know
exactly what personal information the government has registered as well as the purpose of
this registration. The requirement of influence
addresses the important goal of being able to
control who gets access to personal data. We use
the term influence, instead of the stronger term
control, because government systems usually have a purpose that rarely allows the individual
to completely opt-out.
This chapter does not aim to propose solutions or recommendations of technical tools for
protecting privacy in specific systems or generally. Instead, we define a method for evaluating
systems from a privacy point of view, the result
of which can be used to focus further work on
improving privacy in a given system.

Privacy Protection
With the growth of online systems and the digitalisation of administrative procedures, the interest
in the topic of user privacy is rapidly increasing.
Much of the work being done in securing privacy
is being focused on defining and developing
technical tools and standards for the protection of
private information according to Pfitzmann and
Köhntopp (2000) and the European Parliament
(1995). These tools seek to maximise privacy in
systems by minimising the amount of and restricting the access to personally identifiable data as
well as the amount of information that can be
gathered from the interactions between private
parties and services. However, these methods
are often hampered or outright prevented by the
functional requirements of the systems they seek
to secure. For example, many financial systems
require the real name of their users in order to meet
accountability requirements, so these requirements prevent the system from implementing
privacy through anonymity or pseudonymity.3
When the requirements of the individual systems
prevent the implementation of technical privacy,
other means are needed to ensure the best possible
level of privacy protection. While the technical
privacy enhancing technologies may be readily
available, their full effect on the privacy of a
system is often hard to measure, especially in
the cases where these tools cannot be used to
their full effect.



Technical Privacy Issues

According to Pfitzmann and Köhntopp (2000),
contemporary privacy research, tools, and general
public interest is largely focused on the technical
issues of anonymity, pseudonymity, unobservability, and unlinkability. These four concepts also
define the privacy class of the common criteria
security evaluation described by International
Organization for Standardization (1999). The four
concepts define different levels of anonymity,
from pseudonymity, which provides some level
of accountability at the risk of allowing profiling,
to unlinkability, where two transactions issued by
the same anonymous user cannot be linked to each
other. The technical tools for protecting privacy
are very efficient, but they often impose serious
restrictions on the applications and generally introduce a considerable performance penalty. This
means that there is often a direct conflict between
privacy concerns and the desired functionality of
a given system.

Operational Privacy Issues


In cases where functional requirements limit the
applicability of privacy enhancing technologies,
there should be other safeguards in place to ensure the best possible level of privacy protection.
Usually, such safeguards consist of legislation
and operational procedures that ensure that data
is handled with care. Legislation and operational
procedures cannot directly enforce privacy policies, so they cannot guarantee that privacy risks
are eliminated. The work presented in this chapter
addresses systems where technical tools alone
cannot secure privacy. Working with practical
privacy protection starts in the system's design
phase where the required amount and structure
of personal data is determined and where input
from laws and interest groups is essential. The
legislation must be considered to ensure that the
system conforms to the law, both with respect to what information should be collected and with
respect to what information must not be collected.
External interest groups often have useful perspectives on the scope of information that should
be collected and the necessary procedures for
auditing the use of data. Once the system is up
and running, it is still possible to improve privacy
by ensuring that only properly trained personnel
have access to personal identifying data, as well as
improving the amount and quality of information
available to the registered individuals. Properly
defined operational procedures are therefore an
important element in the protection of privacy,
because they instruct personnel what data should
be collected and how that data should be handled
so that legislation is followed.

Influence on Privacy
The level of privacy protection that is required for
a specific system depends on the type of system
and the type of personal identifiable information
that is managed in the system. There are three
primary categories that influence privacy by defining demands that the system must meet.

Actors
Actors are individuals and institutions, such as
political parties, interest groups, and industry
lobbyists, whose work directly or indirectly influences public opinion about privacy. Their
mutual interests and conflicts can often influence
the level of privacy protection implemented in
a government IT system. Actors that influence
privacy rarely aim to violate privacy directly.
Often, it is the negative side effects of suggestions to change other parts of the system, which
end up influencing privacy in a negative way, for
example, proposals to increase surveillance as
part of anti-terror legislation are meant to save
innocent lives, but they are obviously detrimental
to privacy. Privacy activists, on the other hand, often work directly with privacy with the aim to
protect the private sphere.
There are three main types of actors:

• Government actors are identified by owning or controlling one or more public systems. The actor's relationship with personal data is often a wish to reduce the workload through use of IT systems, which often includes processing of personal data.
• Commercial actors are identified by being driven by a commercial interest. An actor can be an industry association or a private company. The operation of a government IT system can sometimes be outsourced to a commercial actor. Many commercial actors have an interest in the data they collect, which may be used for marketing purposes.
• Non-governmental organisations (NGOs) are a large group of different organisations. These are identified by being motivated by the members' political or social goals. NGOs are not motivated by commercial or government interests.4

Common to all actors is that their willingness to work for a solution to a specific problem depends
on how important the problem is within the scope
of the actor. Another factor is how controversial
a given system is, because actors often want to
set their mark on high profile cases.

Legislation
Although laws are the result of the work of politicians, they are still a decisive factor in government systems, because they are more consistent
and stable than the minds of politicians and other
actors.
An example of such legislation is the Data
Protection Directive from the European Parliament (1995), which is defined to protect personal
data. The goal of this directive is to harmonise

the privacy laws in the individual member states.


All member states of the EU have to follow the
directive, so there is a relatively high level of
privacy protection in the European Union and
the directive ensures that privacy is protected
when new legislation is introduced in any member
state of EU.
This legislation protects personal information
in both commercial and government systems.
The legislation also prevents data from inside the
EU from being transferred to countries outside
the EU, unless the country guarantees the same
level of privacy protection as countries in the EU.
It protects individuals, giving them some basic
rights, for example, to access data, to know the
origin of the data, and to withhold permission to
the use of data in some cases.
The advantage of defining privacy rights in
legislation is that public institutions do not always
have an obvious interest in the protection of privacy, but they always have an interest in following
the law. Without legislation, the developers will
design the systems according to their own ideas,
but the law ensures some basic rules for privacy
protection.

Culture
Culture is an important factor in the perception of
privacy, and it often decides when privacy issues
are of interest. Cultures with a relaxed attitude towards the protection of privacy are also less likely
to protest against the development of systems to
process personal data. Even within the EU, there
are large variations between the citizens' privacy
concerns in the different countries. According to
a privacy poll performed by the European Opinion Research Group (2003), countries such as
Denmark, Spain, and Portugal have populations
where only 13% are very concerned about their
privacy, but the populations of countries such as
Greece and Sweden are much more concerned
about privacy with 58% and 54% of their respective
populations stating that they are very concerned about their privacy. It is interesting to note that
these large differences occur in spite of similar
data protection laws within the European Union,
which suggests to us that the culture of the individual country may play an important role in the
privacy concerns of its citizens. The result of this
variation is that systems, which in some countries
would be faced with severe public concern, might
be more easily introduced in other, less privacy
concerned, countries.
There is a huge difference in how the three categories (actors, legislation, and culture) influence privacy in government systems. There are actors who drive new initiatives. Legislation, on the other hand, limits the development of systems. Culture is also an important factor, since it influences the values that can limit the use and development of new systems.

Privacy Maintenance

Privacy is not clearly defined and the composition of the three categories of actors in the privacy field tends to change over time, so the privacy risks also change over time. This is important to keep in mind when designing new systems or legislation, because both will have to be updated at regular intervals. There are two main areas that serve to maintain the privacy of citizens: the legislation-based protection and the technology-based protection. The risk of placing protection in legislation is that it can be changed. What is legal today might not be legal next year. Therefore, there are different factors that have to be considered:

• The development of technology can open for new ways to use data. In this case the legislation could limit the influence of this factor.
• The environment of the system (the world around the system); a good example of this is the anti-terror legislation. Privacy may here be protected by technology protection, even though this is conflicting with legislation.
• Political change can change the foundation for new privacy legislation. Even though the change might not be introduced to limit privacy, optimisation or detection of benefit fraud can introduce circumstances where privacy is violated beyond the people committing the fraud. In this case, both technology and previous legislation help identify risks of the change.

These examples all show that both technology and legislation in different ways influence the development of privacy, and protect the individual from the harm of future change.

Privacy Enhancing Technologies

Privacy enhancing technologies (PET) are the


technical tools used to protect privacy. The term
does not cover all privacy enhancing methods
as there are many means to protect privacy that
do not involve technology. This means that for
a given system, which processes personal data,
there is a whole range of elements that influence
privacy. Again, these elements are not limited to
technical aspect of data security, but also concerns
practical aspects such as the privacy awareness of
the system builder, the system operators, and the
individual user. Another aspect is the availability
of contact points for the registered individual. As
such, interaction between government and private
persons present a range of potential privacy-risk
areas; communications, data-storage, and dataprocessing:

• Communication: This privacy risk area concerns the risk of someone eavesdropping when private information is sent or received across any communications channel. This could be a person sending an online request to the local library or someone filling in their tax return online. This area concerns using encryption schemes such as public key encryption described by Rivest, Shamir, and Adleman (1978) and the International Organization for Standardization (2006), but also hiding the sender and receiver of the messages as developed by Chaum (1988).
• Data-storage: This concerns the risks associated with storing private data electronically for later use. Later use could be report production or data-mining. Stored data could be the loan records stored in the library systems or the tax returns. Stanton (2004) describes how both physical and cryptographic mechanisms exist to prevent intruders from accessing stored data either remotely or through common theft of the storage media.
• Data-processing: This covers the risks that arise when personal information is somehow used to make decisions. This concerns both human processing, where a person evaluates the data, and automated processing, where a computer makes the evaluation. An example of a process like this could be a librarian registering the return of a book or a tax auditor examining a tax return. The area covers employees' access, such as role-based access control described by Ferraiolo and Kuhn (1992), and interaction with data, as well as the registered individual's control (Blaze, Feigenbaum, & Lacy, 1996) and information procedures.

These three terms cover the spectrum of privacy risks in government systems, but they are
far too generic to base an exact evaluation on.
Therefore these three areas are decomposed into
their privacy relevant components.

Communication

Privacy in communication is very much a matter of securing the transmissions. But it also extends into hiding the very fact that a communication is going on, since the fact that you are talking to someone may be private information. For example, an employer that monitors outgoing telephone calls may discover that an employee has called a recruitment agency, which breaks the privacy of the employee, although the contents of the conversations are unknown. There are many technical tools that protect the communication channels, such as SSL5 and Pretty Good Privacy created by Zimmermann (1995). Pfitzmann and Köhntopp (2000) found that there are many issues that have to be addressed in secure communications, but most of these can be generalized into three main areas:

• Anonymity: Defined as being unable to identify sender and receiver within a space of possible senders and receivers.
• Unlinkability: Defined as being unable to relate separate messages even with knowledge of the channels being used. This ensures that a third party cannot link repeated communications with the same or different resources back to a user.6
• Unobservability: Defined as not being able to distinguish messages sent between two points; they will look like any other message or random noise.

Pseudonymity is defined by Pfitzmann and Köhntopp (2000) as the use of pseudonym IDs; this is worth mentioning here, because it is often the best solution in systems where total anonymity is impossible or impractical. While pseudonymity is an improvement over directly identifiable information, it cannot match the privacy of anonymization of data as long as it is reversible.
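To make the first of these concerns concrete, the short sketch below shows how a client might open an authenticated, encrypted channel before transmitting personal data, using only Python's standard library. The host name is a placeholder, and the example addresses only the secrecy of the transmission; anonymity, unlinkability, and unobservability require the more specialised techniques discussed above.

import socket
import ssl

# Hypothetical endpoint of a government e-service; placeholder only.
HOST = "eservice.example.gov"
PORT = 443

# A default TLS context verifies the server certificate, so the personal
# data is encrypted in transit and only sent to the authenticated service.
context = ssl.create_default_context()

with socket.create_connection((HOST, PORT)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname=HOST) as tls_sock:
        # The request body (e.g., a tax return) is protected on the wire.
        tls_sock.sendall(b"GET /tax-return HTTP/1.1\r\nHost: eservice.example.gov\r\n\r\n")
        reply = tls_sock.recv(4096)

Note that an observer can still see that the client talked to the service at all; the sketch protects the contents of the exchange, not its existence.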

Data-Storage

Storing data securely is also a task that relates to IT security. The task here is that data must be protected against both theft and loss through
negligence. While a locked vault can guarantee protection against theft from outsiders, it does not
protect against employees losing their laptop with
copies of private data on it. This means that not
only must the physical protection be considered,
but also whether the data itself is protected, for
example encrypted, on disk using something like
an encrypted file system as described by Zadok,
Badulescu, and Shender (1998).
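As an illustration of protecting the stored data itself rather than only its location, the sketch below encrypts a record before it is written to disk, assuming the third-party Python cryptography package is available. The record content is fictitious, and key management, which is the hard part in practice, is only hinted at in the comments.

from cryptography.fernet import Fernet  # third-party package: cryptography

# The key must be kept separate from the encrypted records,
# e.g., in a protected key store, not on the same laptop or backup tape.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b"citizen 010203-1234; case notes ..."   # invented personal data
token = cipher.encrypt(record)                    # safe to write to disk or tape

with open("record.bin", "wb") as f:
    f.write(token)

# Only a holder of the key can recover the original record.
original = cipher.decrypt(token)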
In the context of privacy, the sensitivity of data
being stored is also important, that is, the more
sensitive the data being stored is, the larger the
risks and consequences of theft will be. Therefore,
it is important to assess the degree of sensitivity
of the data stored in the system and the possible consequences of a privacy violation (e.g.,
caused by a security breach). This highlights the
importance of storing as little sensitive data as
is possible.
Data-storage also concerns providing access
to the stored data to authorised personnel. This
raises the problems of who should be authorised
to see the information and how much information
an individual employee should be allowed to see.
The answers to these questions differ a lot from
system to system but also within a single system,
as there can be large variations in the level of access required depending on the person's function.
For a government system, it is possible to identify
five separate interest groups relating to a system's
data. The first group is management who has an
interest in the proper treatment of data, but who
is also interested in increasing efficiency thus
reducing costs. The second group is the employees
who, idle curiosity aside, primarily are concerned
with getting the data they need to do their job. The
third group is the society at large, which has an
interest in the data's use in statistical reports and
scientific research but also that current laws and
regulations are upheld. The fourth group consists
of commercial interest groups that may have interests in using private data for targeted marketing
and product development; generally this interest
extends to getting access to as much data as possible. Finally, the registered users themselves have
an obvious interest in their private data, that it is
treated responsibly and well protected.
With the large variation in the protection of
stored data, it is important to also look at the privacy issues in access control, because it is important to restrict the access to only those who really
need it, and only when they need it. In a report
published by the European Parliament (2006) it is
found that many security breaches today are made
by employees or others with legitimate access to
data and it is important to consider this problem
as well when protecting privacy.

Data-Processing
A very privacy sensitive person may argue that
privacy is broken whenever personal data is being
processed. However, in government systems there
is often a reason for the processing of personal
information, which just has to be accepted. The
discussion of whether the purpose of a specific
system in itself is a breach of privacy, such as the
current transfer of airline passenger data between
the European Union and the United States of America
(2007), is another discussion. Here we focus on
the three components of data-processing that are
important in a privacy evaluation.

• Confidentiality is important when private data is processed. Any persons with access must know and follow the requirements of confidentiality. Furthermore, it is important that only the data directly relevant to the process is made available, so that peripheral information does not affect the results of the process.
• Completeness means that the data used in the process must be both complete and correct to ensure that faulty decisions are not made based on the data. Erroneous processing could also lead to further breaches of privacy.
• Insight is important for the people whose personal data is being processed. Insight means access to information about what data has been used in what processes and what the conclusions are, as well as who did the processing. This allows citizens to either accept the use of personal data or to raise objections if they feel that their privacy is being violated.

A further development of ensuring confidentiality, completeness, and insight would be to allow a person lesser or greater control over
data-processing involving his or her personal
data. While this quickly becomes complicated
when considering large systems, it is still desirable from a privacy point of view.
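One simple way to support the insight component, and a first step towards such control, is an append-only processing log that records who used which data for what purpose, so that the registered individual can later be told exactly how his or her data has been handled. The sketch below is only an illustration of this idea; the field names and file format are our own choices, not part of the model.

import json
import datetime

LOG_FILE = "processing_log.jsonl"  # append-only log, one JSON record per line

def log_processing(citizen_id, case_worker, purpose, data_fields):
    """Record a single use of personal data so it can be reported back."""
    entry = {
        "timestamp": datetime.datetime.utcnow().isoformat(),
        "citizen_id": citizen_id,
        "case_worker": case_worker,
        "purpose": purpose,
        "data_fields": data_fields,
    }
    with open(LOG_FILE, "a") as log:
        log.write(json.dumps(entry) + "\n")

def insight_report(citizen_id):
    """Return every logged use of a given citizen's data."""
    with open(LOG_FILE) as log:
        entries = [json.loads(line) for line in log]
    return [e for e in entries if e["citizen_id"] == citizen_id]

# Example: a librarian registers the return of a book.
log_processing("0102031234", "librarian_17", "book return", ["name", "loan record"])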

Frameworks for Privacy Evaluation


There have been some initial efforts to define a
common framework for discourse about privacy.
Lederer, Mankoff, and Dey (2003) have done
similar work on dividing the different privacy
problems into more general categories in order to
improve the clarity of discussions. Their work aims
to improve discourse on the subject of privacy in
regards to different technical tools and protocols.
Their work is, however, not immediately applicable when evaluating a single system which may
include several privacy enhancing technologies.
Their work allows for comparisons on the effects
of different privacy enhancing tools but does not
extend to comparing systems.
Yu and Cysneiros (2002) have presented a
framework to help design systems while satisfying the sometimes contradicting demands of
privacy, usability, and functionality. Their work
establishes a method for engineers to evaluate
the effects that different design decisions have
on the privacy and general usability of the system. The framework is, however, very detailed
and does not naturally extend into non-technical

privacy improvements, which limits its use for non-engineers.

Operational Privacy Assessment Model
In the following, we present a method to assess
the level of privacy protection offered by government IT systems that contain sensitive data. The
model defines a method to identify and highlight
privacy issues either during the development of the
system or as an evaluation of an existing system.
The method gives a standardised evaluation of
privacy, which is not influenced by the current
state of technology or current moral standards.
This evaluation can then be used to apply more
focused solutions for upgrading privacy, solutions
which are tailored to the design of the specific
system and its functional requirements.
Applying the model to a system in its design
phase allows the system designer to evaluate the
privacy issues of the current design and the effects that different design choices may have on
privacy. Focusing on these issues already, before
implementation, makes it significantly easier to
ensure minimal privacy risks in the final system.
The method does not prescribe or recommend
specific solutions to these privacy issues, but only
highlight areas that could benefit from additional
privacy protection. The method also produces
results in a simple graphical form that can be used
to compare two possible design alternatives for
the current system or two systems with similar
functionality, which can be useful when having
to choose between systems.
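To give an impression of how such a comparison might look, the sketch below scores each of the seven categories introduced in the next section on a simple 0-4 scale and prints a rough textual "chart" for two systems side by side. The particular scale, the example scores, and the two systems are illustrative assumptions only; the model itself does not prescribe these values.

CATEGORIES = ["Data protection", "Sensitivity", "Environment", "Surveillance",
              "Ordinary use", "Transparency", "Control"]

# Illustrative scores on a 0-4 scale (0 = poor, 4 = good protection).
system_a = {"Data protection": 3, "Sensitivity": 1, "Environment": 2,
            "Surveillance": 2, "Ordinary use": 4, "Transparency": 1, "Control": 2}
system_b = {"Data protection": 4, "Sensitivity": 2, "Environment": 3,
            "Surveillance": 1, "Ordinary use": 3, "Transparency": 3, "Control": 3}

def show(name, scores):
    """Print a simple bar per category so a layman can compare systems."""
    print(name)
    for cat in CATEGORIES:
        print(f"  {cat:<16} {'#' * scores[cat]:<4} ({scores[cat]}/4)")

show("System A", system_a)
show("System B", system_b)

Such a textual rendering stands in for the simple graphical form described above: the point is that a low bar immediately identifies the category that would benefit most from attention.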

Identifying Privacy Factors


The model divides the factors that influence privacy into different categories, where each category
describes an element of privacy protection. Using
the component breakdown in the fourth section,



Practical Privacy Assessments

we have defined a range of privacy factors, which


are divided into seven main categories that are
explained below. The division into categories helps
keep the model simple to work with. Each category
has its distinct privacy focus, which decides the
elements it contains. It is important that the factors cover all the categories of privacy identified
in the second section as well as the problem areas
described in the fourth section, to ensure that the
result of the model gives a complete image of the
privacy situation. Each category may be further
divided into subcategories that cover more specific
fields of privacy, but are still within the scope of
the main category. The different categories and
subcategories do not only focus on technical
solutions, but also on practical aspects relating
to the operation of the system, such as how is
data stored?, who has access to the data?, and
what personal data is recorded? The categories
are defined as follows:
Data protection: This category covers the
risks concerning data being leaked or stolen. This
is the area of privacy that prevents an outsider
from gaining direct access to the private data,
either where it is stored or in transit over the
network. The category is to a high degree based
on traditional IT security tools. The major areas
of concern are:

• Storage of private data. For private data, both the physical security of the storage media is important, but also that the data is encrypted should a theft occur.
• Communication of private data. This covers the security of communication channels carrying private data between users of a system. In order to ensure privacy, it is important to consider both the security of the data packets and the identity of the communicating parties, that is, hiding the identities of the sender and the receiver to prevent traffic analysis.

The focus of this category is very technical and is considered a traditional topic in IT security. Encryption and physical security are well-known and widely used security mechanisms, and though they are not designed as specific privacy tools, the effects they achieve amount to the same. A system failing to get a decent score in this category not only has problems with privacy, but problems with the general security of the system as well.

Sensitivity: This category covers the sensitivity of data stored and managed by the system. The sensitivity is an expression of the amount of damage a privacy breach could do. The category identifies privacy risks in a given system, so it must include the techniques that are used to protect data in case it is leaked or stolen, such as separation of sensitive data from less sensitive data. This means that the category works to accomplish a measurement of the sensitivity of the data, as well as whether this sensitivity has been lowered by separating identifiable information from the rest of the data. The focus areas of this category are:

• Risk profile, which is a measure of the sensitivity of the data in a system. The measure ranges from items of low sensitivity, for example, phone numbers, to items of high sensitivity, such as medical records.
• ID-separation, which is a measure of the degree to which identification data has been separated from the operational data in a system. The use of pseudonyms, instead of real names, would provide a low separation, while use of completely anonymized data would provide a high degree of separation.

The possible score of this category is highly dependent on the context and the function of the system; for example, a medical journaling system for an emergency room will probably never be able to get a perfect score.
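The ID-separation factor can be illustrated in a few lines: operational data is stored under random pseudonyms, and the link between pseudonym and real identity is kept in a separate, more tightly protected store. The sketch below uses only the Python standard library and invented data; in a real system the two stores would live on separate systems with separate access controls.

import secrets

identity_store = {}     # pseudonym -> real identity (kept separately, tightly controlled)
operational_store = {}  # pseudonym -> case data (used for day-to-day processing)

def register(real_id, case_data):
    """Store case data under a random pseudonym instead of the real identifier."""
    pseudonym = secrets.token_hex(8)
    identity_store[pseudonym] = real_id
    operational_store[pseudonym] = case_data
    return pseudonym

p = register("010203-1234", {"benefit": "housing", "amount": 1200})
# Case handling and statistics run on operational_store alone;
# re-identification requires separate access to identity_store.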


Environment: This category covers the entire environment that surrounds the system.
The environment includes all aspects relating
to privacy, from the ability of interested parties
to comment on and influence the development
process, to the production environment of the
system. The category seeks to assess the work
that has been done to ensure that the system has
been subjected to examination and comments have
been received by special interest groups and the
like. It also examines whether the system is seen
to conform to applicable laws and to what degree
parts or the whole of the system is outsourced
to other companies with different interests or to
other nations with different cultures. The major
areas of concern for this category are:

• Influence concerns whether the concerns and comments of public interest groups have been sought and whether they have come to a common understanding.
• Law and regulation must be followed, and it is important to note whether there has been a professional to check the system's compliance.
• Outsourcing can lead to problems arising from differences in culture and interests. Outsourcing covers not only outsourcing to other countries but also outsourcing from government agencies to private companies.

This category is mostly important during the development phase of the system. Its purpose
is to prevent closed and secretive development
of systems handling private data by emphasising the values of open and publicly scrutinised
systems.
Surveillance: This category covers the surveillance capabilities that the system provides to
the operators of the system. This includes who has
interest in the data, and how easy it is to link data
from this system to other datasets. The category
establishes to what degree the system limits its gathering of data, as well as how much use it makes of common identifiers that can easily be linked to
other datasets. The focus areas are:

• Limiting data is about ensuring that only the necessary data are collected and processed in a system. In order to minimize privacy issues, a system should resist the lure to collect more data than needed; that is, the system should not collect data which is nice-to-have, but should limit itself to only collect data that it needs-to-have.
• Common identifiers concern the use of identifiers which are used in other databases and therefore allow data to be joined from one or more separate databases. Identifiers such as social security numbers are especially dangerous in this regard.

While data-mining may solve a number of important problems for governments and large organisations, it is a very large threat to an individual's privacy. While data contained in individual databases may not present a serious privacy threat in itself, the combination of data from many databases may provide the ability to infer large amounts of private information about an individual.
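The danger of common identifiers is easy to demonstrate: two registers that are fairly harmless on their own can be joined on a shared identifier, such as a social security number, to build a far more revealing profile. The data values below are invented for illustration.

# Two separate registers, each fairly innocuous on its own.
health_register = {"010203-1234": {"diagnosis": "asthma"}}
tax_register    = {"010203-1234": {"income": 320_000, "employer": "Example A/S"}}

# Because both use the same identifier, they can be joined trivially.
profiles = {}
for ssn in health_register.keys() & tax_register.keys():
    profiles[ssn] = {**health_register[ssn], **tax_register[ssn]}

print(profiles)  # combined profile inferred from two "harmless" databases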
Ordinary use: This category covers how data
is processed when it is used on a daily basis. The
category examines how risks of data leaks are
handled, and what is done to reduce individual
users access to sensitive data. This important
category establishes how the operators of the
system are trained to handle such data and to
what degree they are restricted in their data access. Finally, the category examines the operators'
ability to remove the private data from its security
framework by, for instance, printing the data. The
category focuses on the following areas:

Education covers the training received by the personnel, such as case workers, that are actively working with people's private data on a daily basis. Basically, this establishes whether the employees have been informed or educated about privacy and the data they have access to.

Access control concerns to what degree the system restricts individual employees' access to data. Generally, tighter restrictions on data access are better for privacy.

User storage is generally about the ability of the system's users to remove private data from the security framework, for example, by taking data home on their laptop or by printing cases.

Often, within IT security, the greatest threats to a system come from the employees. This is no different for privacy, where such threats may be even more common, as something as casual as curiosity can be difficult to detect or prevent.
Transparency: This category covers the level of information that the system provides to its users about privacy risks in the system, for example, whether the system is able to tell the user why and for what purpose the data is collected. The category also examines to what degree the users are able to influence the way that data is processed in the system. More specifically, the category examines how well the system manages to inform the registered people about the purpose of the registration and the system, contact points, the registered persons' rights, and exactly what data are processed. The major areas of concern in this category are:


Purpose concerns whether the system has a well-defined purpose that motivates the collection of private data, and whether the registered individuals are informed about this purpose.

Contacts concerns how easy it is for individuals to get in touch with representatives of the system, making it possible to get information or make complaints about it.

Information of rights covers to what degree a system informs registered individuals of their rights, if any, relating to the processing of their private data.

Information of data covers the amount of information that registered individuals receive from a system about what data has been collected. This also extends to any extra information the individuals receive about the processing of their data, for example, the names of case workers.

This category of privacy is less about preventing privacy breaches and more about ensuring that every person whose private data is collected and processed is informed of this and given the necessary knowledge and opportunity to ask questions and raise objections.
Control: This category covers the controls that the system implements to check the use and storage of the data. This includes both the users' control of data correctness and external audits of the system. The category examines the users' own level of control over data in the system, but also the level of outside impartial controls to ensure that the system is only doing what it is supposed to do. Finally, the category examines the system's own functions to ensure that data is correct and coherent. The major focus areas of the category are:

Registered individuals' control concerns the level of control an individual is given in the processing of his or her data. Such controls could include the ability to decide who gets access to the data and who should be denied access to the data.

Data correctness concerns the quality of the private data. This examines whether a system double-checks data with other data sources and/or whether individuals are encouraged to check and correct errors.

Audit covers the controls of the system and its functions. A higher score is given for audits performed by independent sources, but even internal audits are better than no control.

The category examines the controls that ensure a running system is performing according to its purpose. With private information being held in the system, the registered individuals have an interest in being able to check that this information is not being misused. This also helps to build trust between the system and the users. Audits may be introduced to ensure that the system performs correctly, even when the users fail to check the system or do not know how to check it. Finally, auditors may also be allowed to check parts of the system that are not visible to the users.

Privacy and Privacy Factors


The privacy factors are derived from a compilation of several sources. The model emphasises the need to cover not only the traditional privacy protection in the form of classic IT security and related technical tools, but also the practical handling of privacy data. The traditional security factors are, according to Jansen and Peen (2007), partly identified by the Common Criteria security evaluation scheme, which has a category covering privacy. This set is further expanded by the focus areas of privacy enhancing technologies as described in the fourth section. A state-of-the-art analysis of current privacy enhancing technologies is used to identify the possibilities in privacy protection. It is this set of possible privacy tools that defines the technical factors in the privacy model and the score of each factor (as each factor's score is based on how well privacy is protected in relation to how well privacy can be protected). The rest of the factors concern the more practical privacy protection or risk reduction. These factors are the ones focusing more on the operational privacy issues covered in the third section. They are defined so that they cover the privacy risks arising from the different interests and motivations of the actors in a system. These factors also cover legislation and cultural issues. Combined, this aims to cover the definition of privacy in context as defined in the second section.
Some overlap does exist between the technical and the practical factors as a result of similarities in the cause of the specific privacy risk. In these cases, the factors have been kept separate to assess the technical and practical solutions separately, as they are not mutually exclusive.

Measuring Privacy Factors


The results of the evaluation of each category are finally combined into an overall result. To produce a result set which can be compared to other sets, it is important that the results are treated uniformly. The operational privacy assessment model defines a series of questions designed to determine the level of privacy protection that the system provides within each category. The specific questions each address one of the issues covered by the category. The questions are designed to result in a percentage score ranging from 0% in cases of the highest privacy risk, to 100% in cases of the lowest possible privacy risk. Each possible answer to a question results in a score that reflects the steps taken (or the lack of steps taken) to reduce the privacy risks. Note that the score approaches 100% as the system approaches the privacy-protection steps that are practically possible, not just theoretically possible. These questions are compiled into a questionnaire that should be answered by someone who knows about the design and operation of the system within the different areas covered by the questionnaire. Typically, the head of IT services will be able to answer most of these questions, but she may delegate some questions to someone who is more familiar with a particular topic. The answers to the questions within each sub-category are aggregated into a single score for each of the seven categories, which may then be plotted on a chart that shows seven percentage values, one for each category. The score for each sub-category is calculated by simply reading off the recorded answer to each question, which is always a percentage value. If there are several valid answers to a particular question, the assessment method selects the one that gives the lowest score. This enforces the policy that no chain is stronger than its weakest link. The result for each category is then calculated as the average of the question scores in each sub-category. The averaging produces seven discrete results, one for each of the above categories.
Example: Consider a system that contains sensitive personal data. An evaluation of the category Sensitivity requires an evaluation of two subcategories, ID-Separation and Risk Profile, which may receive the following scores: ID-Separation = 0% because the person is completely identified (i.e., the highest possible privacy risk), and Risk Profile = 50% because the sensitivity of the personal data is medium (i.e., a medium privacy risk). With these results, the category result is calculated as a simple average, which gives 25%. This category score is then used as one of the seven scores in the final result.
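
The aggregation rules described above (take the lowest-scoring applicable answer to each question, then average the sub-category scores within a category) can be sketched in a few lines of code. The following is only an illustration of that logic, reusing the numbers from the example; the function names and inputs are hypothetical and not part of the model itself.

# Illustrative sketch of the aggregation rules described above (hypothetical names and data).

def question_score(applicable_answer_values):
    """A question's score is the lowest value among all answers that apply,
    enforcing the 'no chain is stronger than its weakest link' policy."""
    return min(applicable_answer_values)

def category_score(subcategory_scores):
    """A category's score is the average of its sub-category scores."""
    return sum(subcategory_scores) / len(subcategory_scores)

# Example from the text: the Sensitivity category with two sub-categories.
id_separation = question_score([0])   # person completely identified -> 0%
risk_profile = question_score([50])   # medium sensitivity -> 50%
print(category_score([id_separation, risk_profile]))  # 25.0, i.e., 25%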

Privacy Assessment Questionnaire


As previously mentioned, the practical assessment of privacy is based on a series of questions answered by a person who is familiar with the system that is being evaluated. These questions are presented in the form of a questionnaire, where each question has a few standard answers that are associated with a numerical value. The scale of these values is equidistant, which means that there is no way to differentiate between systems that offer the same type of protection with different degrees of security; that is, there is no difference in the value of a server that is locked in a normal server room and a server locked in a highly secured area, such as a bunker. This is done because precise weights for this type of issue depend on too many factors that differ from system to system, so we have decided on a unified scale; differences in security only become evident through a combination of questions. The underlying categories for each question are identified in the section on identifying privacy factors, and the way we associate numerical values with each answer is presented in the section on measuring privacy factors. The first page of a standard questionnaire is presented in Figure 1.
In the following, we present the questions for each category and the numerical values associated with each of the possible standard answers that we have identified.
Data protection: This category considers
the protection of data that are being stored in the
system as well as data being transmitted between
components in the system. The data protection
classes are shown in Table 1.
Sensitivity: This category describes the sensitivity of the data managed in the system. This includes a risk profile based on the inherent sensitivity of the data and id-separation, which describes how easy it is to link data to a physical (real-world) identity. The sensitivity classes are shown in Table 2.
Environment: The questions pertaining to the environment focus primarily on factors relating to the development of the system. Questions include whether relevant external organisations have been consulted in the development process, whether competent advice has been sought regarding the legal aspects, and whether parts of the development have been outsourced to another country, where the privacy legislation or the culture of the community may be different. The standard environment classes are shown in Table 3.
Surveillance: This category addresses the
surveillance capabilities that the system offers
to the operators of the system. These capabilities
are normally controlled by limiting the amount of
data collected and preventing linkability through

the use of common identifiers. The standard surveillance classes are shown in Table 4.
Ordinary use: This category focuses on problems relating to the day-to-day operation of the system. This includes the education of staff with respect to privacy issues, the implementation of the necessary access controls to enforce the need-to-know principle, and the ability to make off-line copies of data that are not subject to the controls implemented by the system. The standard ordinary use classes are shown in Table 5.
Transparency: This category focuses on the registered persons' ability to ascertain the purpose for registering personal information, to contact the agents who operate the system, and to find out what rights they have with respect to the registered data, as well as on the actual rights that registered people have with respect to the private data managed in the system. The standard transparency classes are shown in Table 6.
Control: The final category addresses the external controls imposed on the system by the registered users and auditors. These controls include the procedures put in place to ensure the correctness of data managed by the system. The standard control classes are shown in Table 7.

Table 1. Standard data protection classes

Storage
1. No protection: 0%
2. Protected against physical theft or encrypted: 50%
3. Protected against physical theft and encrypted: 100%

Communication
1. No protection: 0%
2. Protected against wire tapping but not against identification: 50%
3. Protected against wire tapping and identification: 100%

Table 2. Standard sensitivity classes

Risk profile
1. Highly sensitive data: 0%
2. Sensitive data: 50%
3. Normal publicly accessible data: 100%

ID-separation
1. No protection, identification is possible: 0%
2. Responsible pseudonyms: 33%
3. Pseudonymised data: 66%
4. Full protection, anonymised data: 100%

Table 3. Standard environment classes

Influence
1. Developed without help from external organizations or users: 0%
2. Developed with a few external organizations or users: 33%
3. Developed with external organizations and/or users: 66%
4. Developed with many external organizations and users, with high public attention: 100%

Law and regulation
1. Developed without legal advice: 0%
2. Developed with legal advice: 50%
3. Approved by the authorities: 100%

Outsourcing
1. All development outsourced to a foreign country: 0%
2. Some/all of the development outsourced to a company that is covered by the same laws: 50%
3. Not outsourced: 100%
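
For readers who want to automate the questionnaire, the standard classes can be encoded as simple lookup tables. The sketch below shows one possible encoding of Tables 1 and 2 in Python; the structure and names are illustrative assumptions, not something prescribed by the model.

# Illustrative encoding of the standard classes from Tables 1 and 2 (values in %).
DATA_PROTECTION = {
    "storage": {
        "no protection": 0,
        "protected against physical theft or encrypted": 50,
        "protected against physical theft and encrypted": 100,
    },
    "communication": {
        "no protection": 0,
        "protected against wire tapping but not against identification": 50,
        "protected against wire tapping and identification": 100,
    },
}
SENSITIVITY = {
    "risk profile": {
        "highly sensitive data": 0,
        "sensitive data": 50,
        "normal publicly accessible data": 100,
    },
    "id-separation": {
        "no protection, identification is possible": 0,
        "responsible pseudonyms": 33,
        "pseudonymised data": 66,
        "full protection, anonymised data": 100,
    },
}

# Looking up the recorded answer for a sub-category yields its percentage score.
print(SENSITIVITY["id-separation"]["responsible pseudonyms"])  # 33

Looking up each recorded answer in such tables, and then averaging as described in the previous section, reproduces the category scores.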




PRIVACY ASSESSMENT

The assessed system's score with respect to these different classes is determined through a questionnaire, which is completed by a person who is familiar with the different aspects of the development and operation of the system. An example of the first page of a questionnaire is shown in Figure 1.
To present an intuitive overview of the operational privacy assessment results calculated by our method, and to enable a comparison of the results of the different categories, we suggest displaying the results using a Kiviat graph, as defined by Kolence and Kiviat (1973), where each category is represented by an axis. The chart displays the strengths and weaknesses of the system tested. High privacy protection is placed at the edge of the chart and low privacy protection is placed close to the centre of the chart. When the scores are connected, a heptagon is formed, and the area of the graph depends on the total score, with a small area showing high privacy risk and a larger area showing lesser privacy risk.
Figure 2 shows the assessment of a fictional government IT system. It shows that the privacy protection of the system is strong in the categories environment, ordinary use, and control; that the system has average scores in surveillance and data protection; and that the areas sensitivity and transparency have weak privacy protection. These final two areas would possibly benefit from further work to improve their privacy level.

Table 4. Standard surveillance classes

Limiting data
1. Collection of all possible data, for possible later data processing: 0%
2. Collection of some extra data: 50%
3. Limited data collection: 100%

Common identifiers
1. Social security numbers used for identification: 0%
2. Some system-dependent identifiers and social security numbers: 50%
3. System-dependent identifiers and no social security numbers: 100%

Table 5. Standard ordinary use classes

Education
1. No information about privacy issues: 0%
2. Warning about system containing sensitive data: 33%
3. Informed about how to handle sensitive data: 66%
4. Educated in handling sensitive data: 100%

Access control
1. Open system, full access to data: 0%
2. Closed system, where some users have access to all data: 33%
3. Some segregation of duties, where access is granted depending on job description: 66%
4. Full segregation of duties, including geographical segregation: 100%

User storage
1. Extraction of data is possible without requirement for encryption: 0%
2. Extraction of data is possible with requirement for encryption: 50%
3. No extraction of data is possible: 100%

Table 6. Standard transparency classes

Purpose
1. No information is given about the purpose of the system: 0%
2. No information is given about the purpose of the system, but there is access to some part of the data: 25%
3. No information is given about the purpose of the system, but there is access to all of the data: 50%
4. Information is given about the purpose of the system, and there is access to some part of the data: 75%
5. Information is given about the purpose of the system, and there is access to all of the data: 100%

Contacts
1. No official way to contact the system owner: 0%
2. It is possible to contact the system owner, but this is not public knowledge: 50%
3. Official and publicly known ways to contact the system owner: 100%

Information of rights
1. No information about information rights: 0%
2. The simple rights are available: 50%
3. The rights are available and explained in a decent manner: 100%

Information of data
1. It is not possible to see registered data or know how it is processed: 0%
2. It is possible to see registered data, but not to know how it is processed: 33%
3. It is possible to see registered data, and to know something about how it is processed: 66%
4. It is possible to see registered data, and to know how it is processed: 100%

Table 7. Standard control classes

Registered individuals' control
1. No control of the data: 0%
2. The process is described but no control: 33%
3. Parts of the process are controllable: 66%
4. All parts of the process are controllable: 100%

Data correctness
1. Not possible to check: 0%
2. Only some administrators can see and correct the data: 33%
3. All administrators can see and correct the data: 66%
4. Individuals can see and correct the data: 100%

Audit
1. No audit: 0%
2. Occasional audit: 33%
3. Internal audit by official schedule: 66%
4. Independent audit by official schedule: 100%

Figure 1. First page in a privacy assessment questionnaire

Figure 2. Example of a Kiviat graph from a fictitious system
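
Since the Kiviat graph is simply a radar chart with one axis per category, it can be reproduced with standard plotting tools. The sketch below uses matplotlib, which is an assumption of this illustration rather than a tool prescribed by the model, and hypothetical scores chosen to resemble the fictitious system described above.

# Sketch of a Kiviat (radar) chart for the seven category scores.
# matplotlib and numpy are assumed to be available; the scores are hypothetical.
import numpy as np
import matplotlib.pyplot as plt

categories = ["Data protection", "Sensitivity", "Environment", "Surveillance",
              "Ordinary use", "Transparency", "Control"]
scores = [50, 15, 80, 45, 85, 20, 80]  # percentages, illustrative only

angles = np.linspace(0, 2 * np.pi, len(categories), endpoint=False).tolist()
angles += angles[:1]          # close the heptagon by repeating the first point
values = scores + scores[:1]

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
ax.plot(angles, values)
ax.fill(angles, values, alpha=0.25)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(categories)
ax.set_ylim(0, 100)           # 0% at the centre, 100% at the edge
plt.show()

A large filled heptagon then corresponds to low overall privacy risk, and a small one to high risk, as described above.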

PRIVACY ASSESSMENT EVALUATION
When the privacy protection instruments of a given system are evaluated, it is important to examine both the quality of the privacy enhancing technologies employed in the system and the operational environment of the system, which includes privacy legislation, system operation, and education of the managers and end-users of the system. The privacy assessment method presented in this chapter includes all of these factors.

PRIVACY ASSESSMENT METHODOLOGY
Producing the assessment result using the model means that, for each of the seven categories in the model, the components must be assigned a percentage value as described in the fifth section. The specific values are found by establishing the amount of work done to protect privacy on a scale between no work and all that is possible. Such values can be found by setting a number of possible levels of work and assigning each level a value. For example, to determine the amount of work done to protect privacy through audits, in the category control, four possible levels can be used (see Box 1). When determining which level is achieved by the system, the lowest denominator takes precedence; for example, an occasional independent audit will only score 33%. Completing these evaluations for all categories and their components produces the values required to calculate the category scores and the Kiviat graph as described in the fifth section.

Box 1. Privacy through audit
1. No audits: 0%
2. Occasional audit: 33%
3. Internal audits by official schedule: 66%
4. Independent audits by official schedule: 100%
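
The lowest-denominator rule can be made concrete with a small sketch. Here the audit levels of Box 1 are treated as ordered requirements, and a system is credited only with the highest level whose requirements it fully meets; the function name and boolean inputs are illustrative assumptions, not part of the model.

# Illustrative sketch of the lowest-denominator rule for the audit levels in Box 1.
def audit_score(has_audit: bool, official_schedule: bool, independent: bool) -> int:
    """Credit the system only with the highest audit level it fully achieves."""
    if not has_audit:
        return 0      # no audits
    if not official_schedule:
        return 33     # occasional audit, even if performed by an independent party
    if not independent:
        return 66     # internal audits by official schedule
    return 100        # independent audits by official schedule

# An occasional independent audit only scores 33%, as noted in the text.
print(audit_score(has_audit=True, official_schedule=False, independent=True))  # 33
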
Realizing the full benefit of a completed evaluation is an important two-step process, where the first step identifies the areas of potential privacy improvements and the second step further analyzes privacy risks and identifies more specific privacy improvements.
Initially, an evaluation using the model will result in a Kiviat graph visualising privacy protection in a given system, as illustrated by Figure 2. In this visualization, the dimensions with the lowest scores are the most immediately interesting, as they represent the largest risks and therefore the largest potential areas for improvements. It is unlikely that all privacy risk areas can be improved, because the system's functional requirements may prevent some privacy improvements. For example, a system requiring large amounts of very sensitive data will receive a low score in sensitivity, representing the associated high risk, and will be unable to improve on that score. The discovery of such areas is in itself valuable information that allows the system owners to counter this risk using other precautions and/or raising the demands in other areas. The results of the privacy assessment must therefore be interpreted in the context of the functional requirements of the system and the limits these requirements impose. The model does not seek to conclude on the question of whether the system's benefits are sufficient for the associated privacy risks to be accepted; it merely seeks to determine and clarify these privacy risks.
Once the overall assessment has been completed, more specific improvements can be found by looking at the individual dimensions.


The dimensions with the lowest scores represent the greatest areas of privacy threats, which are often the most efficient areas to start with, but are not necessarily the easiest, which is why the overall assessment in step one is important. For each dimension, the specific issues that cause the low score can be easily identified. It is for these specific issues that the system owners and developers should try to identify and implement possible improvements. While the evaluation presents the specific privacy issues in a system, it does not provide the specific solutions. The method of improving the individual solutions is highly dependent on the type and structure of the specific system. Any solution must therefore be tailored to the specific system using industry-approved methods and tools. This means that there is a certain amount of work to be done finding an optimal solution for each of the specific privacy issues. Once all the issues of a dimension have been examined, and improved if possible, the work can continue to the next dimension with a low score. This continues as long as time or budget allows, or until all dimensions have been adequately analyzed.
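
This prioritisation step can be mimicked directly from the category scores: sort the dimensions by score and start with the lowest. The short sketch below uses hypothetical scores and an arbitrary cut-off; the threshold is an illustrative choice, not part of the model.

# Illustrative prioritisation of improvement work from a set of category scores (in %).
scores = {"Data protection": 50, "Sensitivity": 15, "Environment": 80, "Surveillance": 45,
          "Ordinary use": 85, "Transparency": 20, "Control": 80}

THRESHOLD = 50  # illustrative cut-off for "low" scores

priorities = sorted((score, category) for category, score in scores.items() if score < THRESHOLD)
for score, category in priorities:
    print(f"Examine {category} first (current score {score}%)")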

PRIVACY ASSESSMENT SCENARIOS
In order to demonstrate the feasibility of the proposed privacy assessment model, we have defined two application scenarios, which illustrate how our model may be used to assess their privacy properties. In each scenario, we provide a description of the setting and the immediate results of an assessment for each category of the model. Also, the resulting Kiviat graph is presented. After the two scenarios, we discuss a comparison of the results.

Scenario 1
The first scenario defines a simple medical journaling system used by doctors and small clinics. The data stored in the system is personal medical information, which is by nature very sensitive. The journaling system is integrated with the doctor's or small clinic's existing IT system, which also stores the data. The system does not encrypt data, but the door to the office is locked after opening hours. The system is developed by a small development company and is only used by a handful of doctors and clinics. The system uses social security numbers to identify patients and stores any personal data that the doctors find relevant. The clerks at the medical clinic have no special training regarding the management of personal information, and they have the same access to the system as the doctor. The system allows doctors to print individual patients' journals so that they can bring them on a house call or take them home. Patients' data are entered into the system the first time they visit the clinic and updated at subsequent visits. The data in the system does not expire and there is no external audit of the system. The patients can, upon request, receive a copy of their data, but they are not otherwise informed about the contents or functions of the system, and the patients have no direct control of the stored data.
Assessment results: Using our model, we get a specific result for each category which indicates the level of privacy risk in that category.
Data protection: While the overall score in this category is relatively high, the model shows an evident opportunity for improvement by encrypting the stored private data. The system scores 25%.
Sensitivity: The category notes some very large risks inherent in a system with this sort of data. Also, the system is too small to properly separate the identification data from the rest of the data set. A low score in the category warns that extra care should be taken. The system scores 12%.
Environment: As a result of the closed development, the system is unable to achieve a top score in this category. The system scores 33%.
Surveillance: The system scores low in this category as a result of the unrestricted amount of personal data, the high sensitivity of the data, as well as the use of common identifiers such as social security numbers. The system scores 0%.
Ordinary use: A bottom score in this category reveals a lot of room for improvement. The clerk has no training in handling personal data and there are no levels of access to differentiate between what data the different users can access. Also, any security in the system is lost when the data is printed or otherwise copied out of the system. The system scores 8%.
Transparency: While the system is not a secret, it does little to actively inform the registered persons. This makes it harder for the registered persons to understand their privacy situation and thus lowers the score of the category. The system scores 46%.
Control: There is no audit of the system, and the registered individuals have very little to say in how and why the data is processed. The system scores 22%.

Scenario 2
This scenario is similar to Scenario 1, but with a different system. The journaling system in this scenario is developed by the IT branch of a large medical company, which also hosts and serves the data from its central server farms. These server farms securely store the data and communicate with the clinics using encrypted channels. The system is able to share data with hospitals and provides online access for the patients to their data. The system uses social security numbers to identify patients and stores any personal data the doctors find relevant. The clerks and doctors using the system attend an introductory course which gives information about the management of personal data. The system has different levels of access for the doctors and the clerks. The system synchronises items such as addresses and phone numbers with government databases. Through the online access, patients are able to browse their personal data and contact the system owners. Also, the system is regularly audited by government auditors.
Assessment results: Using our model, we get a specific result for each category which indicates the level of privacy risk in that category.
Data protection: The system has protection of data both during storage and transport. While this serves to reduce privacy risks, the model does present options for additional improvements. The system scores 50%.
Sensitivity: Due to the sensitive nature of the data, there are significant privacy risks involved in the system. The score of this category reflects this fact. The system scores 16%.
Environment: This system has been developed with more input from external interest groups, as well as having been checked for compliance with current law. However, the possible conflicts of interest from the private firm hosting the data detract from the final score. The system scores 44%.
Surveillance: As with the system described in Scenario 1, the risks that stem from the amounts of data and the common identifiers are considerable. The system scores 0%.
Ordinary use: The extra educational effort combined with the role-based access limits goes a long way towards reducing the risks in this category. However, the possibility of removing data from the security of the system still exists. The system scores 58%.
Transparency: The system allows registered people to access their data and contact their doctor or the system owner if something is wrong. This gives the individual better information about his or her privacy situation with respect to this system and the data that it contains. The system scores 75%.
Control: While the registered persons have no control of the processing of their personal data, the system still improves on privacy by ensuring the correctness of data as well as submitting to regular official audits. The system scores 66%.

PRIVACY ASSESSMENT RESULTS


The privacy assessments of the two scenarios allow us to compare the privacy properties of the two systems by plotting the scores in a Kiviat graph. Figure 3 and Figure 4 show the plotted results of the privacy evaluation. The interpretation of the Kiviat graph is presented in the fifth section.

Figure 3. Privacy assessment of Scenario 1

Figure 4. Privacy assessment of Scenario 2

While the sensitivity, which is inherent in medical systems, is a large factor in the large overall level of privacy risk, there are important differences between the two systems. The categories of sensitivity and surveillance can be difficult to improve directly, which explains the similar scores, but it is not impossible and it is worth considering for both systems.
The system in Scenario 2 is somewhat more professional in design and operation. The extra effort results in considerably improved privacy protection compared to Scenario 1 in the categories of data protection, control, transparency, and ordinary use. While some of these improvements are technical solutions, such as encryption and role-based access control, many are the result of improved policies, practices and awareness. This includes improved training, regular audits, and the effort to inform the registered individuals of what is going on. While these improvements can be more difficult to assess than technical solutions, such as the use of role-based access control, they may prove equally effective and help to significantly reduce privacy risks when technical solutions are not possible.
The comparison reveals that Scenario 2 is in most areas superior to Scenario 1 from a privacy perspective. While some of this is based on technical solutions that may not be easily adaptable for Scenario 1, many of the more practical measures are. Transparency, ordinary use, and control are areas where Scenario 1 could benefit by learning from Scenario 2.
While the system described in Scenario 1 is likely to be cheaper, it comes at the extra cost of poor privacy protection.
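
To make the comparison in Figures 3 and 4 concrete, the category scores quoted in the two assessments above can be tabulated side by side and the largest gaps identified. The sketch below simply reuses those reported scores; it is an illustration of how the results can be compared, not part of the assessment method.

# Category scores (in %) quoted in the scenario assessments above.
scenario_1 = {"Data protection": 25, "Sensitivity": 12, "Environment": 33,
              "Surveillance": 0, "Ordinary use": 8, "Transparency": 46, "Control": 22}
scenario_2 = {"Data protection": 50, "Sensitivity": 16, "Environment": 44,
              "Surveillance": 0, "Ordinary use": 58, "Transparency": 75, "Control": 66}

# Rank the categories by how much Scenario 2 improves on Scenario 1.
gaps = {c: scenario_2[c] - scenario_1[c] for c in scenario_1}
for category, gap in sorted(gaps.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{category}: +{gap} percentage points")

The largest gaps appear in ordinary use, control, transparency, and data protection, matching the categories singled out in the discussion above.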

FUTURE TRENDS
Most people are, in one way or another, concerned about their privacy; this is especially true amongst Internet users, as indicated by Reips (2007). This concern, however, is not easily translated into specific demands for privacy protection until after major privacy breaches become known. Therefore, we do not see demands for privacy coming from the users of government systems, but from politicians and companies that feel a need to avoid scandals caused by the loss of personal data. The reason for this is probably that most people do not see the risks, because they lack understanding of the privacy issues and their risks. Most people only know about privacy risks and loss of personal data from the newspapers, but once data breaches become known, demands for privacy enhancing technologies emerge.
The threats against privacy are also likely to increase. This is probably best illustrated by the latest anti-terror legislation by the Congress of the United States (2001) and the English Parliament (2006). The fear of terror has also prompted legislators to introduce new biometric passports and to require passenger information, including credit card details, for all passengers on commercial airlines. Finally, the European Parliament (2001) has published a report on the existence of a global system for the interception of private and commercial communications (the ECHELON interception system), which also increases surveillance and thereby the threats against privacy.
Not all new privacy enhancing technologies are easily implemented in government systems, but the method defined by our operational privacy assessment model gives an overview of the current privacy risks and security level. This can be used to highlight the issues relating to privacy in the future development of government systems, which may help to make them more privacy-friendly. Furthermore, it may form the foundation for further research into which privacy risks are common and should be the focus of future privacy research.


CONCLUSION
Privacy is a subjective concept, so its definition and importance will vary from person to person. The model presented in this chapter helps to standardise the work of securing privacy in electronic systems. The model contains seven categories that together cover all aspects of privacy. Each category clarifies a range of questions concerning privacy, and the model produces a simple objective result in the form of a scoring system. The scoring system makes it possible to assess the overall privacy level in a system and to compare the system to other similar systems. The score in each category indicates the level of privacy risk within that category, which helps the developers and administrators of government IT systems to identify the privacy factors that should be addressed first. The score relates to the current state of privacy in the system, but it may also help determine how well the system tested may address future privacy problems; for example, if the system has a low score in sensitivity because the information that it manages is highly sensitive, there is little hope for a better score in the future. The standardisation and the overview of privacy risks provided by the model serve to help system owners understand the privacy risks in their systems, as well as help the individuals whose private data is being processed to understand their personal privacy situation. Furthermore, the model addresses all the technical and operational aspects which influence privacy in a system. The model has been evaluated on a few real systems by Jansen and Peen (2007), but we would like to analyse a whole sector within a government administration in order to demonstrate the general applicability of our model. This analysis would also provide valuable information about the general state of privacy in the government sector. The proposed model focuses on government IT systems, which are governed by privacy legislation and where there are few direct motives for civil servants to ignore privacy laws. The privacy issues in the private sector are slightly different, because private companies have financial interests in the personal data that they collect, so further studies are needed to determine whether it is possible to include this aspect in the model without compromising its usability.

FUTURE RESEARCH DIRECTIONS


The privacy model described in this chapter defines a basis for a systematic approach to working with privacy. The current definition of the model is very general, and it may not capture all aspects of privacy in a given application. However, working with the model in practice and, in particular, performing assessments of existing government IT systems will help refine the model so that it may serve as the basis of a privacy certification scheme. Ideally, we would hope for a single general scheme, but we believe that it is more likely that a number of schemes will emerge, each limited to certain genres of systems, such as e-commerce. The narrower the genre of system examined with the model, the easier it may be to find an appropriate definition of privacy in that context. An interesting idea would be to attempt to extend Common Criteria certifications with elements of the model described in this chapter. As the current model is already based on the structure of the Common Criteria, this would be an obvious and valuable extension of the work presented in this chapter. The advantage of certification is that systems with the label "privacy friendly" would help focus the public's attention on privacy.
A more general problem that needs to be addressed is that privacy has different meanings for different people. Moreover, the definition of privacy seems to change when people are forced to consider the issues involved, so the meaning of privacy may actually change when observed. It would therefore be interesting to solve this problem and come up with a generally acceptable definition of the privacy concept. Questionnaires and interviews might help determine the value that people intuitively put on privacy and how this value changes when they are asked to think about problems relating to privacy. This approach could also be used to determine the ability and willingness of people to manage their own privacy, which determines whether an approach based on informed consent is actually useful. Such a study would also help develop new tools that allow people to control the keys to their own data in a non-intrusive and intuitive way. There are a lot of questions to answer in this area, which is typical when it comes to working with privacy, but it would be interesting to see whether these questions could lead to people defining their own ideas of privacy, and to a framework for building IT systems that incorporate these personal definitions of privacy when choosing how to work with private data.

REFERENCES
Blaze, M., Feigenbaum, J., & Lacy, J. (1996). Decentralized trust management. In Proceedings of the 1996 IEEE Symposium on Security and Privacy (pp. 164-173).

Chaum, D. (1988). The dining cryptographers problem: Unconditional sender and recipient untraceability. Journal of Cryptology, 1(1), 65-75.

Congress of the United States. (2001). USA PATRIOT Act of 2001. Retrieved July 16, 2007, from http://thomas.loc.gov/cgi-bin/bdquery/z?d107:H.R.3162

English Parliament. (2006). Terrorism Act 2006. Queen's Printer of Acts of Parliament, UK.

European Opinion Research Group. (2003). Data protection. Special Eurobarometer 196, Wave 60.0.

European Parliament. (1995). Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data.

European Parliament. (2001). Report on the existence of a global system for the interception of private and commercial communications (ECHELON interception system) (2001/2098(INI)).

European Union and United States of America. (2007). Agreement between the European Union and the United States of America on the processing and transfer of passenger name record (PNR) data by air carriers to the United States Department of Homeland Security.

Ferraiolo, D., & Kuhn, R. (1992). Role-based access controls. In Proceedings of the 15th NIST-NCSC National Computer Security Conference (pp. 554-563).

Financial News Online U.S. (2007). JPMorgan client data loss. Story attributed to the Wall Street Journal, reported on Financial News Online U.S. on May 1, 2007.

International Organization for Standardization. (1999). Common criteria for information technology security evaluation (ISO IS 15408). Retrieved July 12, 2007, from http://www.commoncriteriaportal.org/

International Organization for Standardization. (2006). Information technology - Security techniques - Encryption algorithms - Part 2: Asymmetric ciphers (ISO/IEC 18033-2).

Jansen, T. W., & Peen, S. (2007). Privacy i offentlige systemer [Privacy in public systems]. Master's thesis, Informatics and Mathematical Modelling, Technical University of Denmark (in Danish).

Kolence, K. W., & Kiviat, P. J. (1973). Software unit profiles & Kiviat figures. ACM SIGMETRICS Performance Evaluation Review, 1973(2), 2-12.

Lederer, S., Mankoff, J., & Dey, A. (2003). Towards a deconstruction of the privacy space. In Proceedings of the Ubicomp Communities: Privacy as Boundary Negotiation Workshop at Ubicomp 2003.

Pfitzmann, A., & Köhntopp, M. (2000). Anonymity, unobservability, and pseudonymity: A proposal for terminology. In H. Federrath (Ed.), Workshop on Design Issues in Anonymity and Unobservability. Springer Verlag.

Privacy Rights Clearinghouse. (2007). A chronology of data breaches. Retrieved May 14, 2007, from http://www.privacyrights.org/ar/ChronDataBreaches.htm

Reips, U.-D. (2007). Internet users' perceptions of privacy concerns and privacy actions. International Journal of Human-Computer Studies, 65(6), 526-536.

Rivest, R. L., Shamir, A., & Adleman, L. A. (1978). A method for obtaining digital signatures and public-key cryptosystems. Communications of the ACM, 21(2), 120-126.

Sandoval, G. (2006). Veterans' data swiped in theft. CNET News.com. Story last modified May 22, 2006.

Smith, R. E. (2006). Laptop hall of shame. Commentary on Forbes.com, September 7, 2006.

Stanton, P. (2004). Securing data in storage: A review of current research. CoRR, cs.OS/0409034.

Walton, R. (2006). Balancing the insider and outsider threat. Computer Fraud and Security, 2006(11), 8-11.

Yu, E., & Cysneiros, L. (2002). Designing for privacy and other competing requirements. In Proceedings of the 2nd Symposium on Requirements Engineering for Information Security (SREIS-02).

Zadok, E., Badulescu, I., & Shender, A. (1998). Cryptfs: A stackable vnode level encryption file system (Technical Report CUCS-021-98). Columbia University, Computer Science Department.

Zimmermann, P. R. (1995). The official PGP user's guide. Cambridge, MA: The MIT Press.
Additional Reading
American Civil Liberties Union (ACLU). (2007). Privacy section of their web site. Retrieved July 12, 2007, from http://www.aclu.org/privacy

Anderson, R. (1996). The eternity service. In Proceedings of the 1st International Conference on the Theory and Applications of Cryptology, PRAGOCRYPT '96.

Brands, S. (2000). Rethinking public key infrastructures and digital certificates. Cambridge, MA: The MIT Press.

Chaum, D. (1981). Untraceable electronic mail, return addresses, and digital pseudonyms. Communications of the ACM, 24(2), 84-88.

Chaum, D. (1985). Security without identification: Transaction systems to make big brother obsolete. Communications of the ACM, 28(10), 1030-1044.

Chaum, D., Fiat, A., & Naor, M. (1990). Untraceable electronic cash. In S. Goldwasser (Ed.), Advances in Cryptology, CRYPTO '88 (pp. 319-327). Springer Verlag.

Clarke, I., Sandberg, O., Wiley, B., & Hong, T. W. (2000). Freenet: A distributed anonymous information storage and retrieval system. In H. Federrath (Ed.), Workshop on Design Issues in Anonymity and Unobservability (pp. 46-66). Springer Verlag.

Dingledine, R., Freedman, M. J., & Molnar, D. (2000). The free haven project: Distributed anonymous storage service. In H. Federrath (Ed.), Workshop on Design Issues in Anonymity and Unobservability (pp. 67-95). Springer Verlag.

Electronic Frontier Foundation. (2007). Privacy section of their web site. Retrieved July 12, 2007, from http://www.eff.org/Privacy

Electronic Privacy Information Center (EPIC). (2007). Privacy web site. Retrieved July 12, 2007, from http://epic.org

European Union. (2006). Directive 2006/24/EC of the European Parliament and of the Council of 15 March 2006 on the retention of data generated or processed in connection with the provision of publicly available electronic communications services or of public communications networks and amending Directive 2002/58/EC.

Goldberg, I. (2000). A pseudonymous communications infrastructure for the internet. Ph.D. thesis, University of California, Berkeley.

Goldberg, I. (2003). Privacy-enhancing technologies for the Internet, II: Five years later. In G. Goos, J. Hartmanis, & J. van Leeuwen (Eds.), Second International Workshop on Privacy Enhancing Technologies (PET 2002) (pp. 209-213). Springer Verlag.

Goldberg, I., Wagner, D., & Brewer, E. (1997). Privacy-enhancing technologies for the Internet. In Proceedings of IEEE COMPCON '97 (pp. 103-110). IEEE Computer Society Press.

Privacy Commissioner of Canada. (2000). The Personal Information Protection and Electronic Documents Act.

Reiter, M., & Rubin, A. (1999). Anonymous web transactions with crowds. Communications of the ACM, 42(2), 32-48.

Stajano, F., & Anderson, R. (1999). The cocaine auction protocol: On the power of anonymous broadcast. In A. Pfitzmann (Ed.), Information Hiding Workshop 1999 (pp. 434-447). Springer Verlag.

Waldman, M., Rubin, A., & Cranor, L. F. (2000). Publius: A robust, tamper-evident, censorship-resistant and source-anonymous web publishing system. In Proceedings of the 9th Usenix Security Symposium (pp. 59-72).

ENDNOTES

1. Healthcare is an example of a service which has primarily been offered by the public sector, but where the private sector is playing an increasing role in many European countries.
2. It is interesting to note that library records of ordinary citizens are among the types of data monitored in the hope of identifying and apprehending potential terrorists before they commit any terrorism.
3. A numbered Swiss bank account is an example of a pseudonymous financial service, but most governments wish to limit the availability of such services because they can be used by criminals to hide the proceeds from their crime.
4. Examples of NGOs that are active in the field of privacy are: the American Civil Liberties Union (ACLU), the Electronic Frontier Foundation (EFF), the Electronic Privacy Information Center (EPIC), and Privacy International.
5. Secure Sockets Layer, RFC 2246.
6. Definition from the Common Criteria issued by the International Organization for Standardization (1999).


Chapter V

Privacy and Trust in Online Interactions
Leszek Lilien
Western Michigan University, USA
Bharat Bhargava
Purdue University, USA

ABSTRACT
Any interaction, from a simple transaction to a complex collaboration, requires an adequate level of trust between interacting parties. Trust includes a conviction that one's privacy is protected by the other partner. This is as true in online transactions as in social systems. The recognition of the importance of privacy is growing, since privacy guarantees are absolutely essential for realizing the goal of pervasive computing. This chapter presents the role of trust and privacy in interactions, emphasizing their interplay. In particular, it shows how one's degree of privacy can be traded for a gain in the level of trust perceived by the interaction partner. After a brief overview of related research, the idea and mechanisms of trading privacy for trust are explored. Conclusions and future trends in dealing with privacy and trust problems complement the chapter.

INTRODUCTION
Any interaction, from a simple transaction to a complex collaboration, can be successful only if an adequate level of trust exists between interacting entities. One of the more important components of trust of an entity in its interaction partner is its reliance that the partner is both willing and able to protect the entity's privacy. This is as true in cyberspace as in social systems.
The need for privacy is broadly recognized by individuals, businesses, the government, the
computer industry, and academic researchers.


Examples are shown in Table 1. The growing recognition of the importance of privacy is motivated not only by users' sensitivity about their personal data. Other factors include business losses due to privacy violations, and enactments of federal and state privacy laws. Even more important, the quest for the promised land of pervasive computing will fail if adequate privacy guarantees are not provided.
The role of trust and privacy is fundamental in social systems as well as in computing environments. The objective of this chapter is to present this role in online interactions, emphasizing the close relationship between trust and privacy. In particular, we show how one's degree of privacy can be traded for a gain in the level of trust perceived by one's interaction partner.
We begin with a brief overview in the next section, presenting the background for research on trust, privacy, and related issues. First, we define trust and privacy, and then discuss their fundamental characteristics. Selecting the most relevant aspects of trust and privacy to be employed in a given application and computing environment is in and of itself a significant challenge. The reason is that both trust and privacy are very complex, multifaceted concepts.
Privacy and trust in computing environments are as closely related, and as interesting in the various aspects of their interplay, as they are in social systems (Bhargava, Lilien, Rosenthal, & Winslett, 2004). On the one hand, a high level of trust can be very advantageous. For example, an online seller might reward a highly trusted customer with special benefits, such as discounted prices and better quality of services. To gain trust, a customer can reveal private digital credentials: certificates, recommendations, or past interaction histories.
Table 1. Recognition of the need for privacy by different entities

Recognition of the need for privacy by individuals (Cranor, Reagle, & Ackerman, 1999)
99% unwilling to reveal their SSN
18% unwilling to reveal their favorite TV show

Recognition of the need for privacy by businesses
Online consumers worrying about revealing personal data held back $15 billion in online revenue in 2001 (Kelley, 2001)

Recognition of the need for privacy by the federal government
Privacy Act of 1974 for federal agencies (Privacy Act, 2004)
Health Insurance Portability and Accountability Act of 1996 (HIPAA) (Summary HIPAA, 2003; Mercuri, 2004)

Recognition of the need for privacy by computer industry research (examples)
IBM, incl. the Privacy Research Institute (IBM Privacy, 2007)
Topics include: pseudonymity for e-commerce; EPA and EPAL (enterprise privacy architecture and language); RFID privacy; privacy-preserving video surveillance; federated identity management (for enterprise federations); privacy-preserving data mining and privacy-preserving mining of association rules; hippocratic (privacy-preserving) databases; online privacy monitoring
Microsoft Research, including the Trustworthy Computing Initiative (Trustworthy Computing, 2003)
The biggest research challenges: reliability, security, privacy, and business integrity
Topics include: DRM (digital rights management), incl. watermarking surviving photo editing attacks; software rights protection; intellectual property and content protection; database privacy and privacy-preserving data mining; anonymous e-cash; anti-spyware

Recognition of the need for privacy by academic researchers (examples)
Trust negotiation with controlled release of private credentials, privacy-trust tradeoff
Trust negotiation languages
Privacy metrics
Anonymity and k-anonymity
Privacy-preserving data mining and privacy-preserving database testing
Privacy-preserving data dissemination
Preserving location privacy in pervasive computing, and privacy-preserving location-based routing and services in networks
Genomic privacy



On the other hand, a mere perception of a privacy threat from a collaborator may result in a substantial lowering of trust. In particular, any sharing of an entity's private information depends on satisfactory limits on its further dissemination, such as a partner's solid privacy policies. Just a privacy threat can impede sharing of sensitive data among the interacting entities, which results in reduced effectiveness of the interaction and, in extreme cases, even in termination of the interaction. For instance, a user who learns that an Internet service provider (ISP) has carelessly revealed any customer's e-mail will look for another ISP.
The idea and mechanisms of trading privacy for trust, the main topic of this chapter, are explored in the following section. It categorizes types of privacy-for-trust tradeoffs, and shows how to exchange one's privacy for trust in an optimal way. The remaining sections, in turn, present our view of future trends in research on privacy and trust; include conclusions; present future research directions for privacy and trust in computing; include references; and suggest additional reading material that can supplement the topics of this chapter.

BACKGROUND: TRUST, PRIVACY, AND RELATED WORK
The notions of trust and privacy require an in-depth discussion of their background, including their interplay. It is provided in this section.

Trust and Its Characteristics


Definition of Trust
We define trust as reliance on the integrity, ability, or character of a person or thing (The American, 2000). Use of trust is often implicit. Frequently, it is gained offline (Bhargava et al., 2004). A user who downloads a file from an unfamiliar Web site trusts it implicitly by not even considering trust in a conscious way. A user who decides to buy an Internet service from an Internet service provider may build her trust offline by asking her friends for recommendations.
Each entity E should have a good reason to expect that its interaction partner is both willing and able to protect E's privacy. This indicates that the dimensions of trust include integrity and competence of a trusted party. That is, the integrity dimension of trust is a belief by a truster that a trustee is honest and acts in favor of the truster, and the competence dimension of trust is a belief in a trustee's ability or expertise to perform certain tasks in a specific situation. Predictability can be attached as a secondary measure to both an integrity belief and a competence belief (Zhong, Lu, Bhargava, & Lilien, 2006).

Implicit and Explicit Trust


Trust is a powerful paradigm, truly ubiquitous and
beneficial in social systems, that enables smooth
operation of such systems, also under conditions
of uncertainty or incomplete information. Trust
has been comprehensively used and well tested.
The need for trust exists in all interactions, irrespective of the fact whether the parties involve
individuals, institutions, or artifacts. For example,
trust is constantlyif often unconsciouslyapplied in interactions between people and animals
(e.g., a guide dog), or people and artifacts (e.g.,
Can I rely on my car for this long trip?).
Trust has to be approached differently in closed
and open systems. In the former, trustworthiness
of an interaction partner is known to the initiator of an interaction before the interaction starts,
and in the latter it is not known. An example of a
closed social system is a small village where people
know each other (or at least know each others
reputations). Trust is used implicitly since each
villager has a rather good clue what to expect of
everybody else. In short, X feels how much to
trust Y. An example of an open social system is
a large city where trust must be used explicitly to

8

Privacy and Trust in Online Interactions

avoid unpleasant surprises (such as being harmed


by a dishonest or incompetent car mechanic or
dentist). A city dweller needs to ask around to
find a trusted entity she needs, inquiring of friends, office mates, and so forth. She can also consult professional reputation databases, such as the Better Business Bureau (BBB) or the AAA's
Approved Auto Repair Network.
Trust has proven its usefulness in social systems. We need similarly ubiquitous, efficient, and
effective trust mechanisms in cyberspace.
We have both closed computing systems, such as a LAN serving a research lab, and open computing environments, such as the World Wide Web or WiFi hot spots. Only the latter include users who are not known in advance to the system. For example, an access control subsystem for a WiFi hot spot must determine the permitted actions for each user, including a new and completely unknown user.
We believe that many users or computer systems err by not considering the trust issue at all. They
do not assume trust implicitly. They simply ignore
the issue of trust. Without even knowing it, they
trust blindly, without any evidence, justification,
or verification. Such a mistake is made also by
any operating system that trusts all application
programs, allowing any program, including malware, to run. As another example, too many users do not even know that they show naïve trust by accessing unknown Web sites, which can
harm them or their computers.
Still, closed computing systems, analogous to a small village, have been working well without applying the notion of trust, at least explicitly. However, it becomes more and more difficult to handle open computing systems, analogous to a big city, without the assistance of the powerful trust paradigm. In the security area, for example, the confidentiality-integrity-availability (CIA) paradigm has served
sufficiently well in closed systems but it has to be
replaced or augmented with trust-based solutions
in open environments. Using the trust paradigm


can simplify security problems by reducing complexity of interactions among system components,
both human and artificial ones.

Selected Trust Characteristics


Trust is a very complex and multifaceted notion.
A researcher wishing to use trust in computing
systems must cope with the challenging choice
of the optimal subset of trust characteristics. A
vast variety of different trust-based systems can
result from selecting different subsets. Only some
of the choices will make systems based on them
effective and efficient.
Some of the choices for trust characteristics
include the following:
1. Symmetric vs. asymmetric trust: Symmetric trust assumes that "X trusts Y" implies "Y trusts X," which is not true in general.
Asymmetric trust does not assume such
implication and is, therefore, more general.
Symmetric trust can be viewed as its special case, which can be chosen only in very
special circumstances or applications.
2. Gradual vs. binary trust: The former allows for degrees of trust, and can be defined
on a multilevel or a continuous scale. The
latter is an all-or-nothing proposition, which
forces one to specify a single trust threshold
above which full trust can be assumed. Binary trust, as a special case of gradual trust,
is insufficient in general. It can be assumed
only for very special and limited settings.
3. Explicit vs. implicit trust: Implicit trust is used by either ignorant or naïve interaction parties. For instance, a user who downloads a file from an unfamiliar Web site trusts it implicitly by not even considering trust in
a conscious way. The consequences might
include penetration by malware.
Explicit trust allows for its clear specification, assuring that trust considerations are
not ignored. Given X's need for determining trustworthiness of Y, only explicit trust


allows for determination of the party that
vouches for trustworthiness of Y, and assumes risks when this trust is breached. It
could, but does not have to be the case, that
Y vouches for its own trustworthiness (e.g.,
via its behavior in earlier interactions with
X).
Explicit trust might be gained offline. For
instance, a person who decides to buy an
Internet service from an Internet service
provider (ISP) may build her trust offline by
asking her friends for trustworthy ISPs.
4. Direct vs. indirect trust: Direct trust between X and Y, when X trusts Y, is limited
to cases when X has gained a degree of trust
in Y from previous interactions. (This may,
but does not have to, mean that Y gained
any degree of trust in X).
It is obvious that the domain of trust can be
significantly extended by relying not only
on direct trust but also on indirect trust. For
indirect trust, X does not need to trust Y to
be willing to interact with it. It is sufficient
that X finds an intermediary Z such that X
has a sufficient degree of trust in Z and Z
trusts Y. (To be more precise, in this case
X needs to trust to a sufficient degree in Z's
recommendations about trustworthiness of
Y).
Z becomes a trusted third party (TTP). A
TTP can be any entity accepted by X, in
particular, it can be an institution set up to
provide indirect trust, also on a commercial
basis. Examples of such institutions are
certification bodies of all kinds, including
providers of digital certificates (see the sketch following this list).
5. Type of trusted entities: Should trust be
lavished only on humans? We believe that
the answer should be no. We trust our cars,
refrigerators, cellphones, PDAs, or RFID
tags in stores. As is the case with humans,
this trust can be breached if the devices are

disloyal, or loyal to parties other than their owners or primary users. Loyalty decides
who the entrusted party works for. For example, sensors and recorders in a car can
work not for the driver but for an insurer, a
browser can work for a commercial advertiser, and a sensor network in one's house can be hijacked by a nosy neighbor or, in the worst case, by Big Brother.
6. Number of trusted entities: The most
critical distinction is between trusting somebody and trusting nobody. The latter leads to
paranoid behaviors, with extremely negative consequences on system performance,
including exorbitant costs.
We believe that "you cannot trust everybody, but you have to trust somebody." Trusting
more partners improves performance as
long as trust is not abused. Any breach of
trust causes some performance penalties. An
optimal number of trusted entities should be
determined.
7. Responsibility for a breach of trust: If
no TTP is involved, is the truster or the
trustee responsible for deciding on the
degree of trust required to offer or accept
a service? As a consequence, is the truster
or the trustee ultimately responsible for
bearing consequences of possible breaches
of trust? In commercial relationships, most
often a buyer determines whether the seller
is trustworthy enough and then, at least once the warranty period is over, bears
the possible costs of broken trust. There are,
however, cases when it is the seller who pays
for abuses by the buyer (as in the case when
terrorists are not prevented from boarding
a plane).
If a TTP is involved in a trust relationship,
it may be held responsible to the extent allowed by its legal obligations.
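To illustrate how several of these characteristics interact, the sketch below combines gradual, asymmetric, direct trust values with indirect trust obtained through a trusted third party Z. All trust values and the multiplicative chaining rule are illustrative assumptions, not a composition method prescribed in this chapter.

```python
from typing import Optional

# Minimal sketch of gradual, asymmetric, direct, and indirect trust.
# Trust values are assumed to lie in [0.0, 1.0]; the chaining rule
# (multiplying X's trust in the TTP Z by Z's trust in Y) is an
# illustrative assumption.

direct_trust = {
    ("X", "Z"): 0.9,   # X trusts Z's recommendations highly (no entry for Z -> X: asymmetry)
    ("Z", "Y"): 0.7,   # Z has direct experience with Y
}

def trust(truster: str, trustee: str, via: Optional[str] = None) -> float:
    """Return direct trust, or indirect trust obtained through an intermediary."""
    if via is None:
        return direct_trust.get((truster, trustee), 0.0)  # unknown party: no blind trust
    # Indirect trust: the truster relies on the TTP vouching for the trustee.
    return direct_trust.get((truster, via), 0.0) * direct_trust.get((via, trustee), 0.0)

print(trust("X", "Y"))           # 0.0 -- X has no direct experience with Y
print(trust("X", "Y", via="Z"))  # ~0.63 -- indirect, gradual trust via the TTP Z
print(trust("Y", "X"))           # 0.0 -- asymmetry: Y's trust in X is a separate value
```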


Caveats

A few words of caution are in order (Bhargava et al., 2004). First, using a trust model too complex for an application domain (i.e., including superfluous trust aspects) hurts flexibility or performance. Second, excessive demands for evidence or credentials result in laborious and uncomfortable trust-based interactions, while insufficient requirements make them too lax. (In the latter case, who wants to befriend someone who befriends crooks and thieves?) Third, exaggerating the need for explicit trust relationships hurts performance. For example, modules in a well-integrated (hence, closed) system should rely on implicit trust, just as villagers do. Also, in a crowd of entities, only some communicate directly, so only they need to use trust, and even among them not all need to use trust explicitly.

Privacy and Its Characteristics

Definition of Privacy

We define privacy as "the right of an entity (normally a person), acting in its own behalf, to determine the degree to which it will interact with its environment, including the degree to which the entity is willing to share information about itself with others" (Internet Security, 2007). We fully embrace the possibility, indicated by the words "an entity (normally a person)," to extend the scope of the notion of privacy from a person to an entity. The latter may be an organization, an artifact (software in particular), and so forth. The extension is consistent with the use of the notion of trust also in relationship to artifacts (The American, 2000), and with the common practice of anthropomorphization of intelligent system components (such as objects and agents) in computer science. The extension is useful for discussion of privacy not only for humans but also for artificial entities (acting, more or less directly, on behalf of humans).

Selected Privacy Characteristics

Privacy has three dimensions: (a) personal privacy of an entity, demanding protection of the entity against undue interference (such as physical searches) and against information that violates the moral sense of the entity; (b) territorial privacy, calling for protection of the area surrounding the entity, such as laws on trespassing; and (c) informational privacy, requiring protection of the gathering, compilation, and dissemination of information (Fischer-Hübner, 2001).

Any interaction involves an exchange of data. It is hard to find any data that (at least in conjunction with other data, including offline data) does not carry any private information on its sender. Hence, informational privacy is endangered by each interaction involving release or dissemination of data.

The release of sensitive data can be controlled in various degrees, from none to full control. It can also be categorized as voluntary, pseudo-voluntary, or mandatory (including the case of information release required by law). Pseudo-voluntary data dissemination is particularly deceitful, since it appears to give a user the freedom to decline sharing his private information, but only at the cost of denying the user access to a desirable service. As a simple example, a person who refuses for privacy reasons (including fears of receiving more spam) to enter an e-mail address on a Web site can be denied the site's services. Quite often, in the name of a real need or just a convenience, the user is forced or pressured to provide private data. (This is a tradeoff between privacy and convenience that should be studied.)

The amount of privacy lost by disclosing a piece of information is affected by the identity of the recipients of this information, possible uses of this information, and related private information disclosed in the past. First, the recipients of private information include not only direct but also all indirect recipients, who receive some of this private information from entities other than the


user. For example, a doctor, the direct recipient of


the patient's private information, passes some of this
information to an insurer, an indirect recipient.
Any indirect recipient can disseminate information further. In our example, the insurer can pass
some information to the user's employer. Second,
possible uses of information vary from completely
benevolent to the most malicious ones, with the
latter including the most painful case of identity
theft. Third, related private information disclosed
in the past has a life of its own, like a genie let
out of the bottle. At best, it is limited only by the
controls that its owner was able to impose on its
dissemination, for example, asking a company
not to sell it to or share it with other businesses. At
worst, it can be retrieved and combined with all
pieces of information about the owner, destroying
much of the owner's privacy.

Threats to Privacy

Threats to privacy can be classified into four categories (Fischer-Hübner, 2003):

1. Threats to privacy at the application level
2. Threats to privacy at the communication level
3. Threats to privacy at the system level
4. Threats to privacy in audit trails

In the first category, threats to privacy at the application level are due to collection and transmission of large quantities of sensitive data. Prominent examples of these types of threats are large projects for the information highway, including large peer-to-peer systems (Can, 2007) or implementations of applications for public administration networks, health networks, research networks, electronic commerce, teleworking, distance learning, and so forth. In the second category, threats to privacy at the communication level include risks to the anonymity of communication, such as: (i) threats to anonymity of the sender, forwarder, or receiver; (ii) threats to anonymity of the service provider; and (iii) threats to privacy of communication (e.g., via monitoring, logging, and storage of transactional data). In the third category, threats to privacy at the system level are due to attacks on the system in order to gain access to its data. For example, attacks on access controls can allow the attacker to break into confidential databases. Finally, in the fourth category, threats to privacy in audit trails are due to the wealth of information included in system logs and audit trails. Special attention should be paid to logs and trails that have gained an independent life, away from the system from which they were derived.

Another view of threats to privacy (Fischer-Hübner, 2003) categorizes the threats as:

1. Threats due to aggregation and data mining
2. Threats due to poor system security
3. Government-related threats. They arise in part because the government holds a lot of people's most private data (including data on taxes, homeland security, etc.) and it is difficult to strike the right balance between people's privacy on the one hand and homeland security concerns on the other hand.
4. Threats due to use of the Internet, for example, interception of unencrypted e-mail, recording of visited Web sites, and attacks via the Internet.
5. Threats due to corporate rights and business practices. For instance, companies in the United States may collect data that even the federal government is not allowed to gather.
6. Threats due to the many traps of "privacy for sale," that is, temptations to sell out one's privacy. Too often, online offers that seem to be free are not really free, since they require providing the benefactor with one's private data. An example is providing one's data for a free frequent-buyer card.


Escalation of Threats to Privacy in Pervasive Computing
Pervasive computing exacerbates the privacy
problem (Bhargava et al., 2004). People will be
submerged in an ocean of zillions of computing
devices of all kinds, sizes, and aptitudes (Sensor
Nation, 2004). Most of them will have limited or
even rudimentary capabilities and will be quite
small, such as RFID tags and smart dust. Most
will be embedded in artifacts for everyday use,
or even within human bodies, with possibilities
for both beneficial and apocalyptic consequences.
Unless privacy is adequately protected, the progress of pervasive computing will be slowed down
or derailed altogether.
Pervasive devices with inherent communication capabilities might even self-organize into
huge, opportunistic sensor networks (Lilien,
Kamal, Bhuse, & Gupta, 2006; Lilien, Gupta, &
Yang, 2007) able to spy anywhere, anytime, on
everybody and everything within their midst.
Without proper means of detection and neutralization, no one will be able to tell which and how
many snoops are active, what data they collect,
and who they are loyal to. Questions such as "Can I trust my refrigerator?" will not be jokes: the refrigerator will be able to snitch on its owner's dietary misbehavior to the owner's doctor.
We might ask these serious questions, notwithstanding their humorous appearance. Will
pervasive computing force us to abandon all hope
for privacy? Will a cyberfly, with high-resolution
camera eyes and supersensitive microphone ears,
end privacy as we know it?1 Should a cyberfly be
too clever to end up in the soup, the only hope
might be to develop cyberspiders. But cyberbirds
might eat those up. So, we will build a cybercat.
And so on and so forth.
Radically changed reality demands new approaches to computer security and privacy. We
believe that we should talk about a new privacy
category, namely, privacy of artificial entities.
We think that socially-based paradigms, such


as trust-based approaches, will play a big role


in pervasive computing. As in social settings,
solutions will vary from heavyweight ones for
entities of high intelligence and capabilities, such as humans and intelligent systems, interacting
in complex and important matters, to lightweight
ones for less intelligent and less capable entities
interacting in simpler matters of lesser consequence.

Interplay of Privacy and Trust


Privacy and trust can be in a symbiotic or in an
adversarial relationship. We concentrate here on
the latter, when users in interactions with businesses and institutions face tradeoffs between a
loss of their privacy and the corresponding gain of
trust by their partners. An example of a symbiotic
relationship is the situation when better privacy provided by a commercial Web site results in its customers' higher degree of trust.
Users entering an online interaction want to
gain a certain level of trust with the least loss
of their privacy. This is the level of trust that is
required by an interaction partner, for example,
a Web site, to provide a needed service, for example, an online purchase of a gift. The interaction
partner will ask a user for sensitive information
such as certain credentials, for example, the user's
credit card number and other cardholder information. These credentials, when provided online, are
indeed digital credentials, despite the fact that
non-digital credentials, such as a plastic credit
card, are their basis.
This simple scenario shows how privacy and
trust are intertwined. The digital credentials are
used to build trust, while providing the credentials
reduces the degree of privacy of their owner. It
should be noted that in a closed environment a user
could receive a certain service while revealing much
less private information. For example, a student
can order free educational software just by logging into a password-protected account, without
any need for providing his credit card information.

Privacy and Trust in Online Interactions

Obviously, entering only one's login and password reveals less sensitive data than providing one's credit card information.
We cannot expect that privacy and trust are provided for free or traded for free, under any cost measures. Only in an ideal world would we never lose our privacy in any interaction, be fully trusted at the same time, and be provided these benefits at no cost. In reality, we can only approach this optimum by providing minimal privacy disclosures, ones that are absolutely
necessary to gain a level of trust required by the
interaction partners. The mechanisms providing
minimal privacy disclosures and trust carry costs,
including costs of computation, communication,
storage, and so forth.
It is obvious that gaining a higher level of trust
may require a larger loss of privacy. It should also
be obvious that revealing more sensitive information beyond a certain point will produce no more
trust gains, or at least, no more useful trust gains.
For example, a student wishing to enter a tavern
must show a proof of his age, exchanging a loss
of privacy for a trust gain. Showing his driver
license is entirely sufficient, and showing his
passport and birth certificate produces no more
trust gains.
It should also be obvious that for each required
level of trust we can determine (at least in theory)
the minimal loss of privacy required to produce
this level of trust. This means that users can (and
usually want to) build a certain level of trust with
this minimal loss of privacy. We want to automate
the process of finding this optimal privacy-for-trust tradeoff, including automatic evaluation
of a privacy loss and a trust gain. To this end,
we must first provide appropriate measures of
privacy and trust, and then quantify the tradeoff
between privacy and trust. This quantification
will assist a user in deciding whether or not to
trade her privacy for the potential benefits gained
from trust establishment. A number of questions,
including the following, must be answered. How
much privacy is lost by disclosing a specific

piece of information? How much trust is gained


by disclosing given data? How much does a user
benefit by having a given trust gain? How much
privacy is a user willing to sacrifice for a certain
amount of trust gain? Only after answering these
questions can we design algorithms and mechanisms that will assist users in making rational
privacy-for-trust decisions. Proper mechanisms
can empower a user's decision-making process, or
even automate it based on policies or preferences
predefined by the user.
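As a first illustration of such a mechanism, the sketch below makes a single disclosure decision by comparing the benefit of the resulting trust gain with a weighted estimate of the privacy loss. The trust levels, benefit values, privacy-loss estimate, and exchange rate are all placeholder assumptions, not values proposed in this chapter.

```python
# Minimal sketch of a rational privacy-for-trust decision for one disclosure.
# All numbers (privacy loss, benefit of a trust level, exchange rate) are
# assumed values for illustration only.

def benefit(trust_level: int) -> float:
    """Assumed benefit function B(t) for trust levels 1..5."""
    return {1: 0.0, 2: 1.0, 3: 2.5, 4: 4.0, 5: 6.0}[trust_level]

def should_disclose(privacy_loss: float, old_level: int, new_level: int,
                    privacy_weight: float) -> bool:
    """Disclose only if the trust gain outweighs the weighted privacy loss."""
    trust_gain = benefit(new_level) - benefit(old_level)
    return trust_gain > privacy_weight * privacy_loss

# Example: disclosing a credit card number costs 2.0 privacy units and would
# raise the user's trust level from 2 to 4; the user values one privacy unit
# as much as one benefit unit.
print(should_disclose(privacy_loss=2.0, old_level=2, new_level=4, privacy_weight=1.0))  # True
```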

Related Work
Related Work on Privacy
Many conferences and journals, not only in computer science and other technical disciplines, focus on privacy. We can mention only a few publications that affected our search for the privacy-for-trust solution presented in this
chapter.
Reiter and Rubin (1999) use the size of the
anonymity set to measure the degree of anonymity for senders or receivers. The anonymity set
contains all the potential subjects that might have
sent or received data. The size of the anonymity
set does not capture the fact that not all senders
in the set have an equal probability of sending a
message. This may help the attacker in reducing
the size of the set of potential senders. Therefore,
the size of the anonymity set may be a misleading measure, showing a higher degree of privacy
than it really is.
Another approach (Diaz, Seys, Claessens, &
Preneel, 2003; Serjantov & Danezis, 2003) uses
entropy to measure the level of privacy that a
system achieves. Differential entropy is used by
Agrawal and Aggarwal (2001) to quantify the
closeness of an attribute value, as estimated by
an attacker, to its original value. These papers
assume a static model of an attacker, in the sense
that the attacker does not accumulate information
by watching the system over time.


The Scrub system (Sweeney, 1996) can be


used to de-identify personal patient information. Privacy is ensured by filtering identifying
information out of data exchanged between
applications. The system searches through prescriptions, physician letters, and notes written by
clinicians to replace information identifying patients, such as their names, phone numbers, and addresses, with generic data. A database of personally identifying information, such as first and last names, addresses, phones, social security numbers, employers, and birth dates, is used
to detect the occurrences of such information.
In addition, the system constructs templates for
different information formats, for example, different formats for phone numbers and dates. These
templates are used to detect variants of personal
information.
Collecting pieces of information from different
sources and putting them together to reveal personal information is termed data fusion (Sweeney,
2001a). Data fusion is more and more invasive
due to the tremendous growth of information on
individuals being electronically gathered (Sweeney, 2001b). The Scrub system does not provide
a sufficient protection against data fusion, that
is, it does not assure complete anonymity. The
Datafly system (Sweeney, 1998; Sweeney, 2002b)
maintains anonymity even if data are linked with
other information sources. While maintaining a
practical use of data, Datafly automatically aggregates, substitutes, and removes information
to maintain data privacy. Datafly achieves data
privacy by employing the k-anonymity algorithm
(Sweeney, 2002b), which provides a formal guarantee that an individual cannot be distinguished
from at least k - 1 other individuals.
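As a small illustration of the k-anonymity guarantee, the sketch below checks whether every combination of quasi-identifier values in a table is shared by at least k records. The sample records and the choice of quasi-identifiers are invented for illustration; this is not the Datafly implementation.

```python
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k):
    """True if every quasi-identifier combination occurs in at least k records."""
    counts = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return all(c >= k for c in counts.values())

# Invented sample: ZIP code and birth year already generalized to ranges.
records = [
    {"zip": "490**", "birth": "1970-1979", "diagnosis": "flu"},
    {"zip": "490**", "birth": "1970-1979", "diagnosis": "asthma"},
    {"zip": "441**", "birth": "1980-1989", "diagnosis": "flu"},
    {"zip": "441**", "birth": "1980-1989", "diagnosis": "diabetes"},
]

print(is_k_anonymous(records, ["zip", "birth"], k=2))  # True: each group has 2 records
```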
Platform for privacy preferences (P3P) is the
best-known protocol and tool suite for specifying
privacy policies of a Web site and preferences of
Web users (Cranor, 2003). P3P is not intended to
be a comprehensive privacy solution that would
address all principles of Fair Information Practices
(Trade Commission, 1998). AT&T Privacy Bird


is a prominent implementation of P3P (Privacy


Bird, 2004). It is a tool that can be added to a
Web browser to keep its users aware of privacy
policies of the visited Web sites.
We do not discuss here general security solutions which contribute to privacy protection.
Examples include protecting software, mobile
objects, or agents from many types of attacks
by either: (i) running them only on dedicated
and tamper-resistant platforms, for example,
on secure coprocessors (Tygar & Yee, 1994); or
(ii) by providing security on commodity hardware, for example, partitioning a single hardware
platform into many isolated virtual machines
or "closed boxes" (Garfinkel, 2003). Examples also include protection of a software client (code) from a malicious host by obfuscating, tamper-proofing, or watermarking the code (Collberg &
Thomborson, 2000).

Related Work on Trust


The problem of establishing and maintaining trust
in dynamic settings has attracted many researchers. One of the first formalized models of trust in
computer science (Marsh, 1994) introduced the
concepts widely used by other researchers, such
as context and situational trust.
A comprehensive social trust model, based on
surveying more than 60 papers across a wide range
of disciplines, has been proposed by McKnight
and Chervany (2001). It has been validated via an
empirical experimental study (McKnight, Choudhury, & Kacmar, 2002). The model defines five
conceptual trust elements: trusting behavior, trusting intention, trusting belief, institution-based
trust, and disposition to trust (cf. Cofta, 2006).
First, trusting behavior is an action that increases a
truster's risk or makes the truster vulnerable to the
trustee. Second, trusting intention indicates that a
truster is willing to engage in trusting behaviors
with the trustee. A trusting intention implies a
trust decision and leads to trusting behaviors. Two
subtypes of trusting intention are: (i) willingness


to depend, that is, the volitional preparedness to


make oneself vulnerable to the trustee; and (ii)
subjective probability of depending, that is, the
likelihood that a truster will depend on a trustee.
Third, trusting belief is a truster's subjective belief
in the fact that a trustee has attributes beneficial to
the truster. The following are the four attributes
used most often: (i) competence: a trustee has the
ability or expertness to perform certain tasks; (ii)
benevolence: a trustee cares about a truster's interests; (iii) integrity: a trustee is honest and keeps commitments; and (iv) predictability: a trustee's actions are sufficiently consistent, so future actions can be predicted based on knowledge of
previous behavior. Fourth, institution-based trust
is the belief that proper structural conditions are
in place to enhance the probability of achieving
a successful outcome. Two subtypes of institution-based trust are: (i) structural assurance: the
belief that deployed structures promote positive
outcomes, where structures include guarantees,
regulations, promises and so forth; and (ii)
situational normality: the belief that properly ordered environments facilitate successful
outcomes. Fifth, disposition to trust characterizes a trusters general propensity to depend on
others across a broad spectrum of situations. Two
subtypes of disposition to trust are: (i) faith in
humanity: the general assumptions about trustees' integrity, competence, and benevolence, that is,
a priori trusting beliefs; and (ii) trusting stance:
a preference for the default trust-based strategy
in relationships.
Zacharia and Maes (2000) proposed two reputation systems, SPORAS and HISTOS. Reputations in SPORAS are global, that is, a principal's
reputation is the same from the perspective of any
querier. HISTOS has the notion of personalized
reputation, that is, different queriers may get different reputation values about the same principal.
In addition to the reputation value, a reputation
deviation is provided to measure the reliability of
the value. Discarding a notorious identity, used
in many systems by dishonest parties to shed

their unfavorable reputations, is unprofitable in


SPORAS and HISTOS, because a newcomer starts
with the lowest reputation value. Carbo, Molina,
and Davila (2003) propose a trust management
approach using fuzzy reputation. The basic idea
is similar to that of SPORAS.
A distributed personalized reputation management approach for e-commerce is proposed by
Yu and Singh (2002a, 2002b). The authors adopt
the ideas from the Dempster-Shafer theory of
evidence to represent and evaluate reputation. If
two principals a and b have direct interactions,
b evaluates a's reputation based on the ratings
of these interactions. This reputation is called
a local belief. Otherwise, b queries a so-called TrustNet for other principals' local beliefs about
a. The reputation of a is computed based on the
gathered local beliefs using the Dempster-Shafer
theory. How to build and maintain the TrustNet
is not mentioned in the papers. Aberer and Despotovic (2001) simplify this model and apply it
to manage trust in a P2P system.
Sabater and Sierra (2002) propose a reputation
model for gregarious societies called the Regret
system. The authors assume that a principal owns
a set of sociograms describing the social relations
in the environment. The Regret system structure
has three dimensions. The individual dimension
models the direct experience between two principals. The social dimension models the information coming from other principals. The ontology
dimension models how to combine reputations
on different aspects. Different reputations are
defined: witness reputation, neighborhood reputation, and system reputation. The performance of
this approach highly depends on the underlying
sociograms. The paper does not discuss how to
build sociograms.
A Bayesian analysis approach to modeling reputation and trust is used by Mui (2002) and Mui,
Mohtashemi, and Halberstadt (2002). Many
reputation models and security mechanisms assume the existence of a social network (Barnes &
Cerrito, 1998). Pujol, Sangüesa, and Delgado (2002)


propose an approach to extract reputation from the


social network topology that encodes reputation
information. Morinaga, Yamanishi, Tateishi, and
Fukushima (2002) propose an approach to mining
product reputations on the Web.

Related Work on Privacy-Trust Optimization
Yu, Winslett, and Seamons (2003) investigated
automated trust negotiation (ATN) considering
the issues of iteratively exchanging credentials
between two entities to incrementally establish
trust. This approach considers the tradeoff between the length of the negotiation, the amount
of information disclosed, and the computation
effort. The major difference between ATN and the
proposed research is that we focus on the tradeoff
between privacy and trust. Our research leads to
a method for estimating the privacy loss due to
disclosing a piece of information, and ways for
making rational decisions.
Wagealla, Carbone, English, Terzis, Lowe, and
Nixon (2003) present a formal model for trust-based decision making. An approach is provided
to manage trust lifecycle with considerations of
both trust and risk assessments. This approach and
our research on trust and evidence formalization
(Bhargava & Zhong, 2002; Zhong, 2005), can be
extended to use trustworthiness of an information receiver to decide whether or not to disclose
private information to him.
Seigneur and Jensen (2004) propose an approach to trade minimal privacy for the required
trust. Privacy is based on a multiple-to-one linkability of pieces of evidence to a pseudonym, and
is measured by nymity (Goldberg, 2000). The
authors assume the presence of a partial order of
nymity levels for the measurement of privacy. Our
research approach employs multiple-to-multiple
relationships between pieces of evidence and
private attributes.

Trading Privacy for Trust


Problems in Trading Privacy for Trust
To gain trust, a user must reveal private digital
credentials: certificates, recommendations, or
past interaction histories. She is faced with a
number of tough questions:

• How much privacy is lost by disclosing a specific credential? To make the answer even more difficult, the amount of privacy loss is affected by credentials and information disclosed in the past.
• How many credentials should a user reveal? If alternative credentials are available (e.g., both a driver license and a passport indicate birth data), which one or ones should be revealed?
• What is the trust gain obtained by disclosing a given credential? Also, what is the minimal degree of privacy that must be sacrificed to obtain a required trust gain? Which credentials should be presented to satisfy this minimum requirement?
• How much does a user benefit by having a given trust gain?
• How much privacy is a user willing to sacrifice for a certain amount of trust gain?

These questions alone show how complex and


difficult the optimization of the privacy-for-trust exchange is. Obtaining an optimal solution without technical support is practically impossible. There
is only a small chance that intuitive approaches
to this process would result in outcomes close to
the optimal results.
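One simple way to make these questions operational is to search, over the credentials a user holds, for the subset that reaches the required trust gain at the smallest total privacy loss. The brute-force search below, the additive cost model, and all numeric values are illustrative assumptions; they are not the optimization method proposed in the next section.

```python
from itertools import combinations

# Illustrative catalogue: credential -> (assumed privacy loss, assumed trust gain).
credentials = {
    "driver_license": (3.0, 4.0),
    "passport":       (5.0, 4.5),
    "credit_card":    (4.0, 3.0),
    "student_id":     (1.0, 1.5),
}

def cheapest_disclosure(required_gain: float):
    """Brute-force the credential subset meeting the gain at minimal privacy loss."""
    best = None
    for r in range(1, len(credentials) + 1):
        for subset in combinations(credentials, r):
            loss = sum(credentials[c][0] for c in subset)
            gain = sum(credentials[c][1] for c in subset)
            if gain >= required_gain and (best is None or loss < best[0]):
                best = (loss, subset)
    return best

# Example: a trust gain of 5.0 is required by the interaction partner.
print(cheapest_disclosure(5.0))   # e.g., (4.0, ('driver_license', 'student_id'))
```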

A Solution for Trading Privacy for Trust
This section presents our proposed solution facilitating privacy-for-trust trading and enabling


optimal outcomes of this process. It discusses


proposed approaches for building and verifying
trust, protecting privacy, and trading privacy
for trust.

Building and Verifying Trust


We focus on methods of building trust in open
and dynamic computing environments, which are
more challenging than the closed and static settings. Digital credentials are common means of
building trust in open environments. Credentials
include certificates, recommendations, or past
transaction histories (Farrell & Housley, 2002;
Fujimura & Nishihara, 2003). Since credentials
contain private information, presenting them in
hopes of gaining trust means privacy losses. We
need to consider problems with credentials, including their imperfect and non-uniform trustworthiness. Since no credentials are perfect, means to
verify trust gained by showing them are necessary.
We present basic ways of verifying trust.

A. Trust Metrics
Trust cannot be built or verified without having
measures of trust or trust gain. We propose a
three-step method for defining a trust gain metric.
In the first step, we determine multilevel trust
metrics with n trust levels, measured on a numeric
scale from 1 to n, where n could be an arbitrarily
large number. Such a metric is generic, applicable to
a broad range of applications, with the value of n
determined for a particular application or a set of
applications. The case of n = 2 reduces multilevel
trust to the simplistic case of binary trust (it might
still be useful in simple trust-based applications),
with trust levels named, perhaps, full_service and
no_service. Selecting, for example, n = 5 results
in having five trust levels that could be named:
no_service, minimal_service, limited_service,
full_service, and privileged_service going from
the lowest to the highest level.

Trust levels could be defined by a service


provider, the owner of a Web site on which it
resides (who might be the same or different from
the service provider), or any other entity that is
an intermediary between the service provider and
the customer. The number of levels n could be increased when the site outgrows its old trust metric,
or when the user becomes more sophisticated and
needs or wants to use more trust levels.
In the second step, a trust benefit function
B(ti), associated with each trust level ti, needs to be
defined. The default trust benefit function for a
service can be defined by the same party that
defined trust levels in the preceding step. An
optional trust benefit function, overriding the
default one, can also be defined by an individual
customer, allowing for a more user-specific benefit metric.
In the third step, trust gain, denoted by G(t2,
t1), can be calculated based on the benefit function. G(t2, t1) indicates how much a user gains
if the user's trust level, as seen by the user's
interaction partner, increases from t1 to t2. The
following simple formula is used to compute the
trust gain:
trust_gain = G(new_trust_level, old_trust_level) =
B(new_trust_level) - B(old_trust_level)
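A minimal sketch of the three steps, assuming n = 5 trust levels with the names given above; the particular benefit values B(ti) are placeholders that a service provider or an individual customer would define.

```python
# Sketch of the three-step trust gain metric with n = 5 levels.
# Level names follow the chapter; the benefit values B(ti) are placeholders.

TRUST_LEVELS = ["no_service", "minimal_service", "limited_service",
                "full_service", "privileged_service"]          # levels 1..5

DEFAULT_BENEFIT = {1: 0.0, 2: 1.0, 3: 2.0, 4: 3.5, 5: 5.0}     # B(ti), assumed

def trust_gain(new_level: int, old_level: int, benefit=DEFAULT_BENEFIT) -> float:
    """G(t2, t1) = B(t2) - B(t1)."""
    return benefit[new_level] - benefit[old_level]

# Example: moving from limited_service (level 3) to full_service (level 4).
print(trust_gain(4, 3))   # 1.5 under the assumed default benefit function
```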

B. Methods for Building Trust


Some of many generic means of building trust are
listed in Table 2. They include familiarity with
the entity to be trusted or its affiliation with a
familiar entity, as well as building trust by first-hand experience or second-hand reputation.
Rather than looking at ways of building trust
in general, we differentiate them depending on
the relative strengths of the interacting parties.
The strength of a party P1 participating in an
interaction with another party, P2, is defined by
P1's capability to demand private information from P2, and P1's means available in case
P2 refuses to comply. As a simple example,


Table 2. Basic means of building trust among partners

• Building trust by familiarity with X
  - Person: face, voice, handwriting, and so forth
  - Institution: company name, image, good will, and so forth
  - Artifact: manufacturer name, perceived quality, and so forth
• Building trust by affiliation of X with person/institution/artifact Y
  - Trust or distrust towards Y rubs off on X
• Building trust by first-hand experience with X's activities/performance
  - Good or bad experience (trust or distrust grows)
• Building trust by second-hand reputation of X determined by evidence or credentials
  - Reputation databases (e.g., BBB, industry organizations, etc.) with good evidence or a lack of bad evidence
  - Credentials: driver license, library card, credit card

a bank is stronger than a customer requesting a


mortgage loan. As another example, two small
businesses negotiating a contract are, in most
cases, equally strong.
We concentrate on different-strength trust
relationships, in which one party is stronger and
another weaker, for example, trust relationships
between individuals and institutions, or between
small businesses and large businesses. We ignore
trust relationships with same-strength partners,
such as individual-to-individual interactions and
most B2B interactions. We will interchangeably use the terms "weaker partner" and "customer," as well as "stronger partner" and "company."
Example means of building trust by a company
in a customer include receiving a cash payment for
a service provided, or checking the partner's records in the eBay reputation databases. Example means
of building trust by a customer in a company
include asking friends about the company's reputation, or checking its reputation in Better Business
Bureau databases.
Multiple means of building trust by a stronger
partner in the weaker partner are shown in Table
3. They can assist a company in a fight against
fraud attempts by a dishonest customer. All
these means can be divided into means preserving
privacy of the weaker partner, and the means not
preserving privacy. Only the first item listed in


Table 3 ("Ask partner for an anonymous payment for goods or services") belongs to the privacy-preserving means, by virtue of preserving the customer's anonymity. All others compromise the customer's privacy and result in disclosing private information. This indicates that, much more
often than not, successful interactions with a
stronger party require that a weaker party trades
its privacy loss for a trust gain required by this
stronger principal.
There are also multiple means of building trust
by a weaker partner in the stronger partner, with
some of them shown in Table 4. All these means
can assist a customer in a fight against fraud attempts by a company. It is clear that the customer's
major weapon is information about the company
and its reputation.

C. Methods for Verifying Trust


Since no credentials are perfect, means to verify
trust are necessary. This is as true in computing
as in social life.2 The basic ways of verifying trust
are shown in Table 5.
Verification must be careful, not based on mere
appearances of trustworthiness, which could be
easily exploited by fraudsters. Cyberspace can
facilitate more careful verification than is the case
in the offline world, in which such verification
might be too costly or too inconvenient. Quite


Table 3. Means of building trust by a stronger partner in her weaker partner

• Ask partner for an anonymous payment for goods or services
  - Cash / digital cash / other
---------- above: privacy-preserving; below: privacy-revealing ----------
• Ask partner for a non-anonymous payment for goods or services
  - Credit card / traveler's checks / other
• Ask partner for specific private information
  - Check partner's credit history
• Computer authorization subsystem observes partner's behavior
  - Trustworthy or not, stable or not, ...
  - Problem: needs time for a fair judgment
• Computerized trading system checks partner's records in reputation databases
  - eBay, PayPal, ...
• Computer system verifies partner's digital credentials
  - Passwords, magnetic and chip cards, biometrics, ...
• Business protects itself against partner's misbehavior
  - Trusted third party, security deposit, prepayment, buying insurance, ...

often a business order sent from Company A to


Company B is processed without verification. The
reasons, in addition to costs and convenience,
include the following factors: (i) implicit trust
prevails in business; (ii) risk of fraud is low among
reputable businesses; and (iii) Company B might
be insured against being cheated by its business
partners, that is, a trusted third-party intermediary might assume transaction risk (for example, a buyer's bank could guarantee a transaction).

Protecting Privacy
Protecting privacy requires defining privacy
metrics as a prerequisite. Privacy measures are
discussed first, and methods for protecting privacy,
relying on metrics, are presented next.

A. Privacy Metrics
We cannot protect privacy if we do not know
how to measure it. This indicates the importance
of privacy metrics. More specifically, we need
a privacy metric to determine what degree of
data and communication privacy is provided by
given protection methods. The metric has to work

in any existing or future combination of users,


techniques, and systems. It has to support or deny
claims made by any such combination that a certain level of privacy will be maintained by it.
This gives rise to at least two heterogeneity-related challenges. First, different privacy-preserving techniques or systems claim different
degrees of data privacy. These claims are usually
verified using ad hoc methods customized for each
technique and system. While this approach can
indicate the privacy level for each technique or
system, it does not allow comparisons of diverse
techniques or systems.
Second, privacy metrics themselves are usually ad hoc and customized for a user model and
for a specific technique or system.
Requirements for good privacy metrics call
for unified and comprehensive privacy measures
to provide quantitative assessments of degrees
of privacy achieved by a broad range of privacy-preserving techniques. A good privacy metric has
to compare different techniques/systems confidently. It also has to account for: (i) operation of
a broad range of privacy-preserving techniques;
(ii) dynamics of legitimate userssuch as how
users interact with the system and awareness that

99

Privacy and Trust in Online Interactions

Table 4. Means of building trust by a weaker partner in his stronger partner


Ask around
Family, friends, co-workers,
Check partners history and stated philosophy
Accomplishments, failures and associated recoveries,
Mission, goals, policies (incl. privacy policies),
Observe partners behavior
Trustworthy or not, stable or not,
Problem: Needs time for a fair judgment
Check reputation databases
Better Business Bureau, consumer advocacy groups,
Verify partners credentials
Certificates and awards, memberships in trust-building organizations (e.g., BBB),
Protect yourself against partners misbehavior
Trusted third party, security deposit, prepayment, buying insurance,

repeated patterns of data access can leak information to a violator; (iii) dynamics of violators, such
as how much information a violator may gain by
watching the system for some time; and (iv) costs
associated with a metric implementation, such
as injected traffic, delays, CPU cycles, and storage use.
We proposed two metrics for assessing the
privacy achieved by a given system: an anonymity
set size metric and an information-theoretic metric,
also known as an entropy-based metric. The first
metric can provide a quick estimate of privacy,
while the second gives a more detailed insight into
the privacy aspects of the system it measures.

Table 5. Basic ways of verifying trust toward entity X

• Verify own experience with X
  - Check own notes about X's activities / performance
• Verify reputation evidence / credentials for X
  - Call back to verify phone number
  - Check online user feedback about quality of an artifact
  - Check reputation database (e.g., consumer reports, BBB)
• Verify affiliation of X
  - Check with employer if X is still employed
  - Check reputation of Y with which X is affiliated


A.1. Effective Anonymity Set Size Metric: Since


anonymity is defined as the state of being indistinguishable within a set of subjects (Pfitzmann
& Köhntopp, 2000), we can use the size of the
anonymity set as a privacy metric. The basic idea
is that of "hiding in a crowd." As illustrated in
Figure 1, hiding among n entities provides more
privacy than hiding among 4 entities (for n >>
4). Clearly, the larger the set of indistinguishable
entities, the lower the probability of identifying any
particular one. This approach can be generalized
to anonymize not only identities of entities but
also the values of their attributes: a selected attribute value is hidden within the domain of its
all possible values.
We need to present this metric in more detail.
The set of subjects, or values, known as the anonymity set, is denoted by A. Using the size of the
anonymity set directly may indicate a stronger
privacy than there really is. The probability
distribution that the violator can assign to individual subjects of the set should be considered.
To illustrate this problem, consider a system
claiming that a subject receiving data cannot be
distinguished from |A| other subjects belonging
to the anonymity set A. Suppose that a violator

Privacy and Trust in Online Interactions

Figure 1. Hiding in a crowd underlies the metrics based on the anonymity set size (hiding among 4 entities: less anonymous, probability 1/4; hiding among n entities: more anonymous, probability 1/n)

has noticed that half of the nodes in A rarely receive messages. Then, he assigns to these nodes a very low probability of receiving a data item. The violator has effectively reduced the anonymity set size to |A|/2. To counter this problem, we define the anonymity set as A = {(s1, p1), (s2, p2), ..., (sn, pn)}, where pi represents the probability assigned to Subject si. Thus, we can determine the effective anonymity set size as:

L = |A| · Σ_{i=1..|A|} min(pi, 1/|A|)        (1)

Note that the maximum value for L is |A|. L equals |A| when all entities in A are equally likely to access data, that is, pi = 1/|A|, 1 ≤ i ≤ n. Equation (1) captures the fact that the anonymity set size is effectively reduced when the probability distribution is skewed, that is, when some entities have a higher probability of accessing data than the others.
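Equation (1) translates directly into a few lines of code. The sketch below uses the formula as reconstructed above; the probability vectors are illustrative.

```python
def effective_anonymity_set_size(probabilities):
    """Effective anonymity set size L = |A| * sum(min(pi, 1/|A|)), Equation (1)."""
    n = len(probabilities)
    return n * sum(min(p, 1.0 / n) for p in probabilities)

uniform = [0.25, 0.25, 0.25, 0.25]   # every subject equally likely
skewed  = [0.5, 0.5, 0.0, 0.0]       # half of the nodes rarely receive messages

print(effective_anonymity_set_size(uniform))  # 4.0 -- the full set size |A|
print(effective_anonymity_set_size(skewed))   # 2.0 -- effectively |A|/2
```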

A.2. Information-theoretic (entropy-based) metric: Entropy measures the randomness in a system, and therefore, it measures the uncertainty that one has about that system (Cover & Thomas, 1991). Building on this notion, we propose to use entropy to measure the level of privacy that a system achieves at a given moment. The idea is that when an attacker gains more information about the system, the uncertainty about subjects that send or receive data, and thus their entropy, is decreased. By comparing a current entropy value with the maximum possible entropy value, we can learn how much information the attacker has gained about the system. Therefore, the privacy of a system can be measured based on how much of its private information was revealed.

a.) Entropy calculation example: Privacy loss D(A, t) at time t, when a subset of attribute values A might have been disclosed, is given by:

D(A, t) = H*(A) - H(A, t)

where H*(A) is the maximum entropy (computed when the probability distribution of the pi's is uniform), and H(A, t) is the entropy at time t, given by:

H(A, t) = Σ_{j=1..|A|} wj · ( - Σ_i pi · log2(pi) )        (2)

with wj denoting weights that capture the relative privacy value of the attributes.

Consider a private phone number: (a1 a2 a3) a4 a5 a6 a7 a8 a9 a10, where the first three digits constitute the area code. Assume that each digit is stored as a value of a separate attribute. Assume further that the range of values for each attribute is [0-9], and that all attributes are equally important, that is, for each j in [1-10], wj = 1.

The maximum entropy exists when an attacker has no information about the probability distribution of the values of the attributes. In such a case, the attacker must assign a uniform


probability distribution to the attribute values. Thus, aj = i with pi = 0.1 for each j and for each i, and we get:

H*(A) = Σ_{j=1..10} wj · ( - Σ_{i=0..9} 0.1 · log2(0.1) ) = 33.3

Suppose that after time t, the attacker can figure out the state in which the phone number is located. This may allow the attacker to learn the three leftmost digits (at least for states with a single area code). Entropy at time t is given by:

H(A, t) = 0 + Σ_{j=4..10} wj · ( - Σ_{i=0..9} 0.1 · log2(0.1) ) = 23.3

Note that attributes a1, a2, and a3 contribute 0 to the entropy value because the attacker knows their correct values. The privacy loss at time t is:

D(A, t) = H*(A) - H(A, t) = 10.0
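The phone-number example can be reproduced with a short script implementing Equation (2). The tiny difference from the rounded figure above (33.3 versus 33.2) comes from using the exact value of log2(0.1); everything else follows the example's assumptions (ten equally weighted digit attributes, three of them disclosed).

```python
import math

def attribute_entropy(probabilities):
    """Entropy of one attribute: -sum(pi * log2(pi)), skipping zero probabilities."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

def total_entropy(attribute_distributions, weights):
    """H(A, t) from Equation (2): weighted sum of per-attribute entropies."""
    return sum(w * attribute_entropy(dist)
               for w, dist in zip(weights, attribute_distributions))

uniform_digit = [0.1] * 10   # attacker knows nothing about a digit
known_digit   = [1.0]        # attacker knows the digit exactly
weights       = [1.0] * 10   # all ten attributes equally important

h_max = total_entropy([uniform_digit] * 10, weights)                     # maximum entropy H*(A)
h_t   = total_entropy([known_digit] * 3 + [uniform_digit] * 7, weights)  # area code leaked

print(round(h_max, 1), round(h_t, 1), round(h_max - h_t, 1))   # 33.2 23.3 10.0
```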
b.) Decrease of system entropy with attribute
disclosures: Decrease of system entropy with
attribute disclosures is illustrated in Figure 2.
The lighter circles indicate the size of the attribute set for private data under consideration, the
smaller darker circles within them indicate the
sizes of subsets of disclosed attributes, the vertical lines to the left of the lighter circles indicate
the maximum entropy H*, and vertical bars to the
left of the lighter circles (superimposed on the H*
lines) indicate the current entropy level. Let us
first consider cases (a) through (c), in which we assume
a fixed size of private data. This fixed size of private data explains why the lighter circles in these
three cases have the same size. When entropy of
data drops to a certain higher threshold value H2,
as shown in case (b), a controlled data distortion
method, increasing entropy of these data, must
be invoked to protect their privacy. Examples of
distortion mechanisms include generalization and
suppression (Sweeney, 2002a) or data evaporation
(Lilien & Bhargava, 2006). When entropy of
data drops below a certain lower threshold level
H1, as shown in case (c), data destruction must
be triggered to destroy all private data. Data destruction is the ultimate way of preventing their
disclosure.
Let us add a bit more detail to this example.
Consider private data that consists of three attributes: a name, a social security number, and a
zip code. Each attribute has a domain of values.
The owner of private data first computes the
maximum entropy H* for all three attributes. The
owner also determines two values for entropy
mentioned above: the higher value H2 (the threshold for triggering controlled data distortions), and
the lower value H1 (the threshold for triggering
data destruction). Each time when private data is
shared or transferred from one entity to another,
the new value of entropy, Hnew, is calculated using
Equation (2). If Hnew stays above H2, no privacy-preserving actions are needed. If Hnew drops below H2 but stays above H1 (H2 > Hnew > H1), a controlled data distortion method is invoked to increase
data entropy. Finally, if Hnew drops below H1 (H1 >
Hnew), data destruction must be invoked to protect
data privacy.
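A sketch of the threshold policy just described. The concrete threshold values and the action labels are placeholders, and the distortion and destruction steps are left as stubs.

```python
# Sketch of the entropy-threshold policy for shared private data.
# H1 < H2 are owner-chosen thresholds; the concrete values are placeholders.

H1, H2 = 10.0, 20.0

def privacy_action(h_new: float) -> str:
    """Decide what to do after a transfer, given the recomputed entropy H_new."""
    if h_new > H2:
        return "no_action"        # enough uncertainty remains
    if h_new > H1:
        return "distort_data"     # e.g., generalization or suppression to raise entropy
    return "destroy_data"         # entropy too low: trigger data destruction

for h in (25.0, 15.0, 5.0):
    print(h, privacy_action(h))
```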
The example assumed that the size of the
private data attribute set is fixed. The entropy
metric can be extended to cases when the private
data attribute set is allowed to grow or shrink.
Case (d) in Figure 2, when compared with case
(c), illustrates the situation in which private data
attribute set grew. This growth is indicated by the
larger diameter of the lighter circle, indicating a
larger attribute set for sensitive data. The sizes
of subsets of disclosed attributes, indicated by
the darker circles, are identical in cases (d) and
(c); do not be fooled by the optical illusion that
the darker circle in (d) is smaller than the darker
circle in (c). As a result, entropy for case (d) is
higher than for case (c), as indicated by a higher
vertical bar for case (d). This utilizes the principle
of hiding in the (larger) crowd.
Entropy can be increased not only by increasing the size of the private data attribute set, as
just demonstrated, but also by making its subset
of disclosed attributes less valuable.

Figure 2. Dynamics of entropy

For example, suppose that a bank releases the current account


balance of a customer to an insurer. This balance
is valid for a specific period of time. After this
period, the value of knowing this private piece
of information decreases, because the customer
could have changed her balance. In computing the
new value of entropy, the balance is assumed to
be private again. This leads to a gradual increase
in entropy. In another example, a bank can increase entropy rapidly: to make stolen credit card
numbers useless, it quickly changes credit card
numbers for all compromised accounts.

B. Methods for Protecting Privacy


Privacy controls for sensitive data are necessary.
Without them, many interaction opportunities are
lost. Examples are patients' symptoms hidden from doctors, given-up business transactions, lost research collaborations, and rejected social contacts.
Privacy can be supported by technical or legal
controls. Examples of legal controls are the EPA
privacy act (Privacy Act, 2004), and HIPAA
Privacy Rule (Summary HIPAA, 2003; Mercuri,
2004) intended to protect privacy of individuals.
Yet, there are many examples of privacy violations even by federal agencies. The sharing of
travelers' data by JetBlue Airways with the federal government was one such incident (Privacy
Act, 2004).
Technical controls for facilitating or enabling
privacy controls must complement legal controls.

Such privacy protection mechanisms should


empower users (peers, nodes, etc.) to protect
user identity, privacy of user location and movement, as well as privacy in collaborations, data
warehousing, and data dissemination. Each party
that obtained another party's sensitive data through
an interaction must protect privacy of these data.
Forwarding them without proper privacy guarantees to other entities could endanger the partner's privacy.
Both the stronger and the weaker party should be
assisted with technical solutions. On the one
hand, the responsibility of the stronger partner
for protecting privacy is larger. The reason is
that the stronger partner obtains more sensitive
data of her counterpart than a weaker partner does.
In many cases, the stronger partner might be the
only party that obtains private data. On the other
hand, the weaker partner should not rely entirely
on the integrity of the stronger counterpart. He
needs mechanisms to protect sensitive data released by him even, or especially, when they
are out of his hands.
This means that at least the following two
kinds of mechanisms are needed. The first
one must assist in minimizing the amount of
private information that is disclosed. A system
for privacy-for-trust exchange, presented in this
chapter, is an example of a mechanism of this
kind. Mechanisms of the second kind provide
protection for further dissemination of sensitive
data that are already disclosed, setting clear and
well-defined limits for such dissemination. They


assure that data disclosed to a stronger party are


not freely disseminated by her to other entities.
For example, a mechanism of this kind assures that only some of the data revealed by a patient to his doctor are forwarded by the doctor to an insurer or a nurse, and the most sensitive data are never forwarded to anybody.3 An example of this kind is the solution named P2D2 (privacy-preserving data dissemination) (Lilien & Bhargava, 2006), which enables control of further dissemination of sensitive data by integrating privacy protection mechanisms with the data they guard. P2D2 relies on the ideas of: (i) bundling sensitive data with metadata specifying privacy preferences and policies; (ii) apoptosis, that is, a clean self-destruction (Tschudin, 1999), of endangered bundles; and (iii) an adaptive evaporation of bundles in suspect environments.
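As a rough illustration of the bundling idea only, a bundle could carry its own privacy metadata and support apoptosis and evaporation. This is a toy sketch under assumed field names and thresholds, not the P2D2 implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Bundle:
    """Toy sketch: sensitive data travelling together with its privacy metadata."""
    data: dict                                        # sensitive attributes (hypothetical)
    sensitivity: dict = field(default_factory=dict)   # attribute -> sensitivity in [0, 1]
    policy: str = "do-not-forward-without-guarantees"

    def apoptosis(self):
        """Clean self-destruction of an endangered bundle."""
        self.data.clear()

    def evaporate(self, suspicion: float):
        """Adaptive evaporation: in a suspect environment, shed the most sensitive
        attributes first; the higher the suspicion, the less data survives."""
        for attr in list(self.data):
            if self.sensitivity.get(attr, 1.0) > 1.0 - suspicion:
                del self.data[attr]

bundle = Bundle(data={"name": "J. Doe", "diagnosis": "flu"},
                sensitivity={"name": 0.4, "diagnosis": 0.9})
bundle.evaporate(suspicion=0.5)   # drops "diagnosis", keeps "name"
```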
B.1. Technical privacy controls: Technical privacy controls, also known as privacy-enhancing technologies (PETs), fall into the following categories (Fischer-Hübner, 2001):

• Protecting user identities
• Protecting usee identities
• Protecting confidentiality and integrity of personal data

We take a look at these categories of technologies in turn.


a. Protecting user identities (Fischer-Hübner, 2001): User identities can be protected via anonymity, unobservability, unlinkability, and pseudonymity of users. First, anonymity ensures that a user may use a resource or service without disclosing the user's identity (Class FPR, 1999).
The special cases of anonymity are: sender
and receiver anonymity, ensuring that a user is
anonymous in the role of a sender or a receiver,
respectively, of a message.
We can define the following six degrees of sender anonymity, from the one fully protecting the sender to the one exposing the sender (Fischer-Hübner, 2001): (i) absolute privacy,
when the sender enjoys full privacy with respect
to being considered as the sender of a message;
(ii) beyond suspicion, when the sender appears
to be no more likely to be the originator of a
message than any other potential sender in the
system; (iii) probable innocence, when the sender
appears to be no more likely to be the originator
of a message than not to be its originator; (iv)
possible innocence, when there is a nontrivial
probability that the real sender is someone else;
(v) exposed, when the sender is highly likely to
be the originator of a message; and (vi) provably
exposed, when the sender is identified beyond any
doubt as the originator of a message.
Second, unobservability ensures that a user
may use a resource or service without others being able to observe that the resource or service is
being used (Class FPR, 1999). Third, unlinkability
ensures that a user may make use of resources and
services without others being able to link these
uses together (ibid). Its special case is unlinkability
of a sender and a recipient, when a sender and a
recipient cannot be identified as communicating
with each other. Fourth, pseudonymity ensures
that a user acting under a pseudonym may use a
resource or service without disclosing his identity
(ibid).
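The six degrees of sender anonymity listed above form an ordered scale. The sketch below is an illustrative encoding only; the integer ordering and the policy threshold are assumptions, not part of the cited classification.

```python
from enum import IntEnum

class SenderAnonymity(IntEnum):
    """Degrees of sender anonymity, from fully protecting to fully exposing the sender."""
    ABSOLUTE_PRIVACY = 1    # full privacy with respect to being considered the sender
    BEYOND_SUSPICION = 2    # no more likely to be the originator than any other sender
    PROBABLE_INNOCENCE = 3  # no more likely to be the originator than not
    POSSIBLE_INNOCENCE = 4  # nontrivial probability that the real sender is someone else
    EXPOSED = 5             # highly likely to be the originator
    PROVABLY_EXPOSED = 6    # identified beyond any doubt as the originator

def acceptable(level: SenderAnonymity,
               threshold: SenderAnonymity = SenderAnonymity.POSSIBLE_INNOCENCE) -> bool:
    """Hypothetical policy check: accept only levels at least as protective as the threshold."""
    return level <= threshold

print(acceptable(SenderAnonymity.BEYOND_SUSPICION))  # True
print(acceptable(SenderAnonymity.EXPOSED))           # False
```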
b. Protecting usee identities (Fischer-Hübner, 2001): In this case, the protected entity is the usee, that is, the subject described by the data. Usee identities can be protected, for example, via depersonalization, providing anonymity or pseudonymity of data subjects.
Depersonalization (anonymization) of data
subjects can be classified as a perfect depersonalization, when data are rendered anonymous in
such a way that the usee (data subject) is no longer
identifiable, and a practical depersonalization,
when a modification of personal data is such that
information concerning personal or material circumstances can either no longer be attributed to


an identified or identifiable individual, or can be


so attributed only with a disproportionate amount
of time, expense and labor. Attackers attempt to
circumvent depersonalization by reidentification
attacks.
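As a toy illustration of practical depersonalization, direct identifiers can be suppressed and quasi-identifiers generalized so that re-attribution would take a disproportionate effort. The field names and rules below are hypothetical examples, not a prescribed method.

```python
def depersonalize(record: dict) -> dict:
    """Suppress direct identifiers and generalize quasi-identifiers (illustrative only)."""
    out = dict(record)
    for identifier in ("name", "ssn", "email"):        # direct identifiers: suppress
        out.pop(identifier, None)
    if "zip" in out:                                    # quasi-identifiers: generalize
        out["zip"] = out["zip"][:3] + "**"              # e.g., 49008 -> 490**
    if "birth_date" in out:
        out["birth_date"] = out["birth_date"][:4]       # keep the year only
    return out

print(depersonalize({"name": "Jane Roe", "zip": "49008",
                     "birth_date": "1980-05-17", "diagnosis": "flu"}))
```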
c. Protecting confidentiality and integrity of personal data (Fischer-Hübner, 2001): Confidentiality and integrity of personal data can be protected by a number of methods and technologies, including privacy-enhanced identity management, access controls, enterprise privacy policies, making data inaccessible with cryptography or steganography (e.g., hiding a message in an image), and utilizing specific tools (such as the Platform for Privacy Preferences, or P3P; Marchiori, 2002).
B.2. Legal privacy controls: For completeness, we also take a brief look at legal privacy controls.
Although definitions of privacy vary according to context and environment, the belief that privacy is a fundamental human right, even one of the most important rights of the modern age, is internationally recognized. At a minimum, individual countries protect the inviolability of the home and the secrecy of communications (Green, 2003).
There are two types of privacy laws in various
countries (ibid): comprehensive laws and sectoral
laws. Comprehensive laws are general laws that
govern the collection, use and dissemination
of personal information by public and private
sectors. They are enforced by commissioners or
independent enforcement bodies. The disadvantages of this approach include lack of resources
for oversight and enforcement agencies, as well
as the governmental control over the agencies.
Comprehensive privacy laws are used in Australia, Canada, the European Union, and the United
Kingdom.
Sectoral laws focus on specific sectors and
avoid general laws. They benefit from being

able to use a variety of specialized enforcement


mechanisms, selecting the ones best suited for
the sector they apply to. Their disadvantage is the
need for new legislation for each new sectoral
technology. Sectoral privacy laws are used in the
United States.
Disparate national privacy laws require international agreements to bridge different privacy
approaches. An example is the Safe Harbor Agreement, reached in July 2000, between the United
States and the European Union (Welcome Safe,
2007). It has been criticized by privacy advocates
and consumer groups in both the United States and the European Union for inadequate enforcement and for relying too much on mere promises of participating companies.

Trading Privacy for Trust


An interacting entity can choose to trade its privacy for a corresponding gain in its partner's trust in it (Zhong & Bhargava, 2004). We believe that the scope of a privacy disclosure should be proportional to the benefit expected by the entity that discloses its private information. For example, a customer applying for a mortgage must, and is willing to, reveal much more personal data than a customer buying a book.
Having measures of trust and privacy defined
above allows for precise observation of these two
quantities, and precise answers to questions such
as: (i) What degree of privacy is lost by disclosing given data? (ii) How much trust is gained
by disclosing given data? and (iii) What degree
of privacy must be sacrificed to obtain a certain
amount of trust gain?

A. Same-Strength and
Different-Strength Trust
As defined earlier, the strength of a party participating in the relationship is defined by the party's capability to demand private information from the other party, and by the means available in case the other party refuses to comply. As a
simple example, a bank is stronger than a customer
requesting a mortgage loan. As mentioned, trust
relationships can be divided into same-strength
or different-strength.

B. Same-Strength and
Different-Strength Privacy-for-Trust
Negotiations
To realize a privacy-for-trust tradeoff, two interacting parties, P1 and P2, must negotiate how much privacy needs to be revealed for trust. We categorize such negotiations as either: (1) same-strength, when both parties are of similar strength; or (2) different-strength, when one party's position is stronger vis-à-vis the other's. In turn, same-strength privacy-for-trust negotiations can be either: (1a) privacy-revealing negotiations, in which parties disclose their certificates or policies; or (1b) privacy-preserving negotiations, in which parties preserve the privacy of their certificates and policies.
We compare all three kinds of privacy-for-trust negotiations, that is, (1a), (1b), and (2), in terms of their behavior during the negotiations. This behavior includes the trust level necessary to enter negotiations, the growth of the trust level during negotiations, and the final trust level sufficient for obtaining a service. Same-strength negotiations are very popular in the research literature. Different-strength negotiations, to the best of our knowledge, were first defined explicitly by us.
B.1. Trust growth in same-strength trust
negotiations: Same-strength trust negotiations
involve partners of similar strength.
a. Trust growth in privacy-revealing same-strength trust negotiations: Negotiations of this type can start only if an initial degree of trust exists between the parties. They must trust each other enough to reveal to each other some certificates and policies right away. From this point on, their mutual trust grows in a stepwise manner as more private information is revealed by each party. Negotiations succeed when sufficient mutual trust is established by the time the negotiation ends, that is, trust sufficient for the requestor to obtain the desired service. This procedure is summarized in Table 6.
b. Trust growth in privacy-preserving same-strength trust negotiations: In contrast to privacy-revealing same-strength trust negotiations, negotiations of this type can start without any initial trust, since they involve no risk related to revealing one's privacy. There are no intermediate degrees of trust established during the negotiations. They continue without mutual trust up to the moment when they succeed or fail. They succeed when sufficient mutual trust is established by the time the negotiation ends. This procedure is summarized in Table 7.
B.2. Trust growth in different-strength trust negotiations: Negotiations of this type can start only if, at their start, the weaker partner has sufficient trust in the stronger partner. This trust is sufficient when the weaker party is ready to reveal the private information required to start gaining the stronger partner's trust necessary for obtaining its service. As negotiations continue, the weaker partner trades a degree of privacy loss for a trust gain as perceived by the stronger

Table 6. Trust growth in privacy-revealing same-strength trust negotiations
• An initial degree of trust necessary
• Must trust enough to reveal (some) certificates / policies right away
• Stepwise trust growth in each other as more (possibly private) information about each other revealed
• Proportional to the number of certificates revealed to each other
• Succeed if sufficient mutual trust established when negotiations completed
• Sufficient for the task at hand

Table 7. Trust growth in privacy-preserving same-strength trust negotiations
• Initial distrust
• No one wants to reveal any information to the partner
• No intermediate degrees of trust established
• From distrust to trust
• Succeed if sufficient mutual trust established when negotiations completed
• Sufficient for the task at hand

partner. It should be clear that the former loses the next degree of privacy when revealing the next private certificate to the latter. (The only exception to privacy loss is the no-privacy-loss case in the anonymity-preserving example of the stronger party building trust in the weaker one, shown in Table 3.)
Negotiations succeed when, by the time the different-strength trust negotiations end, the stronger party has gained sufficient trust in the weaker party to provide it the requested service. This procedure is summarized in Table 8.

Table 8. Trust growth in privacy-preserving different-strength trust negotiations
• Initially, Weaker has a sufficient trust in Stronger
• Weaker must trust Stronger sufficiently to start revealing private information required to gain Stronger's sufficient trust
• Weaker trades a degree of privacy loss for a trust gain as perceived by Stronger
• A next degree of privacy lost when a next certificate revealed to Stronger
• Sufficient trust of Stronger in Weaker established when negotiations completed
• Sufficient for the task at hand

Table 9. Summary of trust growth in same-strength and different-strength trust negotiations
• Trust growth in same-strength disclosing trust negotiations: initial degree of trust / stepwise trust growth / establishes mutual full trust; trades information for trust (information is private or not)
• Trust growth in same-strength preserving trust negotiations (from distrust to trust): initial distrust / no stepwise trust growth / establishes mutual full trust; no trading of information for trust (information is private or not)
• Trust growth in different-strength trust negotiations: initial full trust of Weaker in Stronger and no trust of Stronger in Weaker / stepwise trust growth / establishes full trust of Stronger in Weaker; trades private information for trust

B.3. Summary of privacy-for-trust trading in same-strength and different-strength trust negotiations: Table 9 summarizes trust growth in same-strength and different-strength trust negotiations.

C. Privacy-for-Trust Optimization in Different-Strength Trust Negotiations

The optimization procedure for trading privacy for trust in different-strength trust negotiations, presented next, follows our approach (Zhong & Bhargava, 2004; Zhong, 2005). It includes four steps:

1. Formalizing the privacy-trust tradeoff problem
2. Measuring privacy loss due to disclosing a private credential set
3. Measuring trust gain obtained by disclosing a private credential set
4. Developing a system that minimizes privacy loss for a required trust gain

We distinguish two forms of privacy-for-trust


optimization. The first one minimizes the loss
of privacy by the weaker partner necessary for
obtaining, in the eyes of the stronger partner, a
certain trust level required to get a service. This
is the form discussed in more detail below. The
second form of optimization finds the degree of
privacy disclosure by the weaker partner necessary
for maximizing the trust level obtained from the
stronger partner. We do not discuss this form here, noting only that it is needed in situations when the weaker partner's benefits obtained from the stronger partner are proportional to the trust level attained in the eyes of the stronger partner.
We assume that a user has multiple choices
in selecting sensitive information for disclosure.
For example, in response to an age query, a user
can show a driver license, a passport, or a birth
certificate.
C.1. Formulating the tradeoff problem: Suppose that the private attributes we want to conceal are a1, a2, ..., am. A user has a set of credentials {c1, c2, ..., cn}. A credential set can be partitioned by a service provider into revealed and unrevealed credential subsets, denoted as R(s) and U(s), respectively, where s is the identity of a service provider.
The tradeoff problem can now be formulated as follows: choose from U(s) the next credential nc to be revealed in a way that minimizes privacy loss while satisfying trust requirements. More formally, this can be expressed as:

$$\min \{\, \mathrm{PrivacyLoss}(nc \cup R(s)) - \mathrm{PrivacyLoss}(R(s)) \mid nc \text{ satisfies trust requirements} \,\}$$
This problem can be investigated in two
scenarios:
1. Service providers never collude to discover customers' private information. An individual version R(s) is maintained for each service provider and privacy loss is computed based on it.
2. Some service providers collude to discover customers' private information. A global version Rg that consists of all credentials disclosed to any colluding service provider is maintained. Since the difference between R(s) and Rg is transparent to the procedure for evaluating privacy loss and trust gain, both are denoted as R in further considerations.

The tradeoff problem changes to a multivariate problem if multiple attributes are taken into consideration. It is possible that selecting nc1 is better than nc2 for a1 but worse for a2. We assume the existence of an m-dimensional weight vector [w1, w2, ..., wm] associated with these private attributes. The vector determines the protection priority for the private attributes a1, a2, ..., am, respectively. We can minimize either: (a) the weighted sum of privacy losses for all attributes, or (b) the privacy loss of the attribute with the highest protection weight.
Another factor affecting the tradeoff decision is the purpose of data collection. It can be specified in the service provider's privacy policy statement, for instance, by using P3P (Marchiori, 2002). Pseudonymous analysis and individual decision are two data collection purposes defined in P3P. The former states that collected information will not be used in any attempt to identify specific individuals. The latter states that information may be used to determine the habits, interests, or other characteristics of individuals. A user could make different negotiation decisions based on the stated purpose of data collection. Furthermore, the service provider's trustworthiness in fulfilling the declared privacy commitment can be taken into consideration.
C.2. Estimating privacy loss: We distinguish
two types of privacy losses: the query-dependent
and query-independent ones. Query-dependent
privacy loss for a credential nc is defined as the
amount of information that nc provides in answering a specific query. The following example
illustrates a query-dependent privacy loss for a
credential. Suppose that a users age is a private
attribute. The first query asks: Are you older
than 15? The second query tests the condition
for joining a silver insurance plan, and asks:
Are you older than 50? If the user has already
presented a valid driver license, we are 100%
sure that the answer to the first query is yes but
the probability of answering yes to the second


query by a person with a driver license is, say,


40%. Privacy loss for a revealed driver license is
here query-dependent since it varies for different
queries: it is a full privacy loss (100%) for the first
query, and only a partial (probabilistic) privacy
loss (40%) for the second query. This example
also makes it clear that the value of revealed
information (such as a driver license) can vary
for different queries.
Let us consider now an example illustrating a
query-independent privacy loss for a credential.
Suppose that a user has already presented her
driver license, which implies that she is older than
If she uses her Purdue undergraduate student ID as the next piece of evidence, a high query-independent privacy loss ensues, since this credential greatly reduces the probable range of her age. Consider a third query asking: "Are you an elementary school student?" The student's ID is redundant as a credential for this query because the revealed driver license has already excluded
the revealed driver license has already excluded
this possibility. This shows that a credential having
a high query-independent privacy loss may not
necessarily be useful to answer a specific query
(a large privacy loss with no trust gain).
Two types of methods can be used to measure
a privacy loss: probabilistic methods and the
predefined lattice method.
a. Probabilistic methods for estimating privacy
losses: There are two probabilistic methods: one
for evaluating query-dependent privacy losses,
and another for evaluating query-independent
privacy losses. More specifically, the first probabilistic method evaluates the query-independent
privacy loss for disclosing a credential ci with
respect to one attribute aj that has a finite domain
{v1, v2, ..., vk}. The probability of aj = vi before disclosing the credential is Prob(aj = vi | R). The probability of aj = vi with a given credential ci disclosed is Prob(aj = vi | R ∪ ci). The privacy loss is measured as the difference between entropy values (Young, 1971):

$$\mathrm{PrivacyLoss}_{a_j}(c_i \mid R) = -\sum_{i=1}^{k} P_i \log_2 P_i + \sum_{i=1}^{k} P_i^{*} \log_2 P_i^{*}$$

where $P_i = \mathrm{Prob}(a_j = v_i \mid R)$ and $P_i^{*} = \mathrm{Prob}(a_j = v_i \mid R \cup c_i)$.
The second probabilistic method evaluates the
query-dependent privacy loss based on the knowledge of a complete set of potential queries. Let q1,
q2, ..., qn denote n queries. Let pri be the probability that qi is asked, and wi be the corresponding weight indicating the protection priority for this query. We can now evaluate the privacy loss for disclosing a credential ci in response to a query qk. Suppose that there are r possible answers to the query. The domain of the attribute aj is divided into r subsets {qv1, qv2, ..., qvr} based on the query answer set. The privacy loss with respect to attribute aj and query qk is computed as:

$$\mathrm{PrivacyLoss}_{a_j, q_k}(c_i \mid R) = -\sum_{i=1}^{r} P_i \log_2 P_i + \sum_{i=1}^{r} P_i^{*} \log_2 P_i^{*}$$

where $P_i = \mathrm{Prob}(a_j \in qv_i \mid R)$ and $P_i^{*} = \mathrm{Prob}(a_j \in qv_i \mid R \cup c_i)$.
The query-dependent privacy loss with respect to attribute aj is evaluated by the following formula:

$$\mathrm{PrivacyLoss}_{a_j}(c_i \mid R) = \sum_{k=1}^{n} \mathrm{PrivacyLoss}_{a_j, q_k}(c_i \mid R) \cdot pr_k \cdot w_k$$

Bayesian networks (Jensen, 1996) and kernel density estimation can be used for conditional probability estimation.
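A minimal sketch of the entropy-difference computation above follows; the distributions, probabilities, and weights are hypothetical, and the helper names are illustrative rather than taken from any implementation.

```python
import math

def entropy(dist):
    """Shannon entropy (bits) of a distribution given as {value: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def query_independent_loss(before, after):
    """PrivacyLoss_aj(ci | R): entropy of aj given R minus entropy given R plus ci."""
    return entropy(before) - entropy(after)

def query_dependent_loss(per_query_losses, query_probs, weights):
    """Weighted sum over queries: sum_k loss_k * pr_k * w_k."""
    return sum(l * p * w for l, p, w in zip(per_query_losses, query_probs, weights))

# hypothetical age-bracket distributions before and after revealing a driver license
before = {"<16": 0.25, "16-50": 0.50, ">50": 0.25}
after  = {"<16": 0.00, "16-50": 0.60, ">50": 0.40}
print(f"query-independent loss: {query_independent_loss(before, after):.2f} bits")
print(f"query-dependent loss:   {query_dependent_loss([1.0, 0.3], [0.6, 0.4], [1.0, 0.5]):.2f}")
```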
b. The predefined lattice method for estimating privacy losses: The second type of method for measuring a privacy loss is the predefined lattice method. This method assumes that each credential is associated with a tag indicating its privacy level with respect to attribute aj. The tag set is organized as a lattice (Donnellan, 1968) in advance. Tags are assigned to each subset of credentials as follows. Suppose that TB and TA are two tags and TB ≺ TA. TA and TB are assigned to credentials cA and cB, respectively, if the information inferred from cA is more precise than what can be inferred from cB. cA determines a possible value set VA for aj, and cB determines another set VB. The formula to compute the privacy loss is given in Box 1, where lub is the least upper bound operator (ibid).
C.3. Estimating trust gain: We have already
shown in the section on Trust Metrics the way
to compute a trust gain. It requires defining a
trust benefit function B(ti) associated with each
trust level ti. Then, the trust gain G is calculated
as follows:
trust_gain =
G(new_trust_level, old_trust_level) =
B(new_trust_level) - B(old_trust_level)
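The four steps above can be tied together with a small greedy sketch: reveal, one credential at a time, the unrevealed credential with the smallest marginal privacy loss until the required trust gain is reached. This is an illustration of the optimization idea under assumed function signatures, not the PRETTY implementation.

```python
def choose_credentials(unrevealed, revealed, privacy_loss, trust_gain, required_gain):
    """Greedy privacy-for-trust tradeoff sketch (illustrative only).

    privacy_loss(R) -> float : loss for a revealed credential set R (e.g., entropy-based)
    trust_gain(R)   -> float : B(trust level reached with R) - B(initial trust level)
    """
    R, U, chosen = set(revealed), set(unrevealed), []
    while trust_gain(R) < required_gain and U:
        # marginal loss of revealing nc next: PrivacyLoss(nc with R) - PrivacyLoss(R)
        nc = min(U, key=lambda c: privacy_loss(R | {c}) - privacy_loss(R))
        U.remove(nc)
        R.add(nc)
        chosen.append(nc)
    return chosen if trust_gain(R) >= required_gain else None  # None: requirement unreachable
```

A multi-attribute variant would simply replace privacy_loss with the weighted sum of per-attribute losses discussed above.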
C.4. PRETTY, a system minimizing privacy loss for a required trust gain: A prototype system named PRETTY (private and trusted systems) was developed in the Raid Laboratory at Purdue University (Zhong & Bhargava, 2004). PRETTY, shown in Figure 3, uses a server/client architecture. It uses as one of its components the existing TERA (trust-enhanced role assignment) prototype, which was also developed in the Raid Lab (Zhong, Lu, & Bhargava, 2004). TERA evaluates the trust level of a user based on the user's behavior. It decides whether the user is authorized for an operation based on the policies, the credentials, and the level of trust. The user's trust value is dynamically updated when more data on the user's behavior become available.
The server component consists of the server
application, the TERA server, the privacy negotiator, the set of privacy policies, the database,

and the data disseminator. The client component


of PRETTY consists of the user application, the
credential manager, the evaluator of trust gain
and privacy loss, the privacy negotiator, and a set
of privacy policies. The evaluator of trust gain
and privacy loss implements the mechanisms
presented in this chapter. It prompts a user to enter
the utility function for the ongoing interaction. The
evaluator either automatically makes the tradeoff
decision, or provides the user with the evaluation
results for privacy loss and trust gain.
A typical interaction is illustrated in Figure 3
with numbered arrows. An arrow number (1 to 4,
some with letter suffixes) corresponds to an item
number in the interaction description.
A number in parentheses in the figure denotes
an unconditional path (e.g., (1)), that is, a path
followed by all interactions. A number in brackets
denotes a conditional path (e.g., [2a]), that is,
a path followed only by some interactions. An
interaction takes place as follows:
1. The user application sends a query to the server application.
2. The server application sends user information to the TERA server for trust evaluation and role assignment. (Examples of roles are a student role and a faculty role, the latter granting its actor broader access rights.)
   2a. If a higher trust level is required for the query, the TERA server sends a request for more of the user's credentials to the local privacy negotiator.
   2b. Based on the server's privacy policies and credential requirements, the server's privacy negotiator interacts with the client's privacy negotiator to build a higher level of trust.
   2c. The trust gain and privacy loss evaluator of the client selects credentials that will increase trust to the required level with the minimal privacy loss. The calculations consider credential requirements and credentials disclosed in previous interactions. (This item includes two actions: [2c1] and [2c2].)
   2d. According to the client's privacy policies and the calculated privacy loss, the client's privacy negotiator decides whether or not to supply credentials to the server.
3. Once the achieved trust level meets the minimum requirements, the appropriate roles are assigned to the user for execution of the user's query.
4. Based on the query results, the user's trust level, and privacy policies, the data disseminator determines: (i) whether to distort data to protect its privacy and, if so, to what degree; and (ii) what privacy enforcement metadata (e.g., policies) should be associated with it.

Box 1.
$$\mathrm{PrivacyLoss}_{a_j}(c_i \mid \emptyset) = \mathrm{Tag}(c_i)$$
$$\mathrm{PrivacyLoss}_{a_j}(c_i \mid R) = \mathrm{lub}\big(\mathrm{PrivacyLoss}_{a_j}(c_i \mid \emptyset),\ \mathrm{PrivacyLoss}_{a_j}(R \mid \emptyset)\big)$$

Figure 3. Architecture of PRETTY

FUTURE TRENDS IN PRIVACY- AND TRUST-RELATED SOLUTIONS
Technical privacy-related and trust-related solutions will continue their strong impact on online
consumer protection. The future trends related to
privacy will be determined, among others, by the
following challenges (Bhargava, Farkas, Lilien,
& Makedon, 2003; Bhargava, 2006):
1. Defining and measuring privacy and privacy policies: How to define privacy? How to define privacy metrics? How to best define privacy policies? How to best perform privacy requirement analysis and stakeholder analysis? How to analyze and manage privacy policies?
2. Determining technologies that endanger privacy in computing environments: What technologies or system components endanger privacy in computing environments, and how to prevent this? As an example, how to prevent pervasive computing from illegitimately monitoring and controlling people? How to assure anonymity in pervasive computing environments? How to balance anonymity with accountability under these circumstances?
3. Finding new privacy-enhancing technologies: What technologies can be utilized or exploited to provide privacy, and how to use them to this end? What are the best ways of privacy-preserving data mining and querying? How to monitor Web privacy and prevent privacy invasions by undesirable inferences? How to address the issue of monitoring the monitor, including identification and prevention of situations when incorrect monitor data result in personal harm?

The future trends related to trust will be determined, among others, by the following challenges (Bhargava et al., 2003):

1. Improving initiation and building of trust: How to create formal models of trust, addressing the issues of different types of trust (e.g., trust towards data subjects, or data users)? How to define trust metrics able to compare different trust models? How should trust models assist in selecting and accommodating trust characteristics? How should the models of trust handle both direct evidence and second-hand recommendations? How can TTPs be used to initiate and build trust? How do timeliness, precision, and accuracy affect the process of trust building?
2. Maintaining and evaluating trust: How to collect and maintain trust data (e.g., credentials, recommendations)? How and when to evaluate trust data? How to discover betrayal of trust, and how to enforce accountability for damaging trust? How to prevent trust abuse, for example, by preventive revocation of access rights? How to motivate users to be good citizens and to contribute to trust maintenance?
3. Constructing practical trust solutions: How to scale up trust models and solutions? What is the impact of trust solutions on system performance and economics? How to guarantee performance and economy of trust solutions? How and what economic incentives and penalties can be used for trust solutions?
4. Engineering trust-based applications and systems: How to experiment with and implement trust-based applications and systems for e-government, e-commerce, and other applications? How to enhance system performance, security, economics, and so forth, with trust-based ideas (such as enhancing role-based access control with trust-based mappings)?

CONCLUSION
Providing tools for privacy-for-trust exchange is


critical to further developments of online interactions. Without privacy guarantees, there can
be no trust, and without at least some trust no interactions can even commence, unless a party is totally oblivious to the dangers of privacy loss, up to the point of identity theft.
will avoid any negotiations if their privacy is
threatened by a prospective negotiation partner.
Without trust-building negotiations, no trust can
be established.
The stakes are becoming higher, since privacy guarantees are becoming absolutely essential as we progress towards pervasive computing. The more pervasive devices become, the higher their potential for violating privacy. Unless adequate technical privacy controls and privacy-for-trust support are provided, the possibility of huge privacy losses will scare people off, crippling the promise of pervasive computing.
The objective of this chapter was to present an approach and a tool for protecting privacy in
privacy-for-trust exchanges. We presented the notions of privacy and trust, their characteristics, and
their role in online interactions, emphasizing the


tradeoff between these two phenomena. An overview of problems facing a person wishing to trade
privacy for trust was followed by a description of
our proposed solution. It started with a look at
trust metrics and means for building and verifying
trust, and continued with a presentation of two
privacy metrics: an effective anonymity set size
and an entropy-based metric. We then discussed
technical means for protecting privacy.
We categorized the processes of trading
privacy for trust into same-strength privacy-for-trust negotiations and different-strength privacy-for-trust negotiations, dividing the former into
privacy-revealing and privacy-preserving subcategories. The described privacy-for-trust solution
is intended for optimization in different-strength
trust negotiations. It involves four steps: formulating the tradeoff problem, estimating privacy loss,
estimating trust gain, and minimizing privacy
loss for a required trust gain. We provided a brief
description of PRETTY, a system minimizing
privacy loss for a required trust gain.

FUTURE RESEARCH DIRECTIONS FOR PRIVACY AND TRUST RESEARCH
We have shown that privacy and trust enable
and facilitate collaboration and communication.
We indicated their growing role in open and
dynamic environments. To increase the benefits
of privacy-related and trust-related solutions, a
number of research directions should be pursued
(Bhargava et al., 2003). For privacy-related solutions, the following research problems should be
addressed (ibid):
1. Privacy metrics: Issues of privacy of users or applications, on the one hand, and privacy (secrecy, confidentiality) of data, on the other hand, intertwine. Metrics for personal and confidential data usage should be developed to measure who accesses data, what data are accessed, and how they are accessed. Metrics and methods for measuring privacy-related aspects of data quality should be provided. Accuracy in information extraction should be measured, since inaccurate information can obstruct accountability or harm privacy (e.g., in the case of a wrongly identified individual).
2. Privacy policy monitoring and validation: We need to better understand how to monitor and validate privacy policies, and to develop technologies that ensure correct enforcement of privacy policies. Researchers should address monitoring and validating privacy aspects of data integration, separation, warehousing, and aggregation. An interesting issue is licensing of personal data for specific uses by their owners (an example is Ms. Smith agreeing to receive house-for-sale advertising by licensing her e-mail rights to a real estate advertiser).
3. Information hiding, obfuscation, anonymity, and accountability: Researchers should address different ways of assuring anonymity via information hiding and obfuscation, ranging from steganography through location security and hiding message source and destination from intermediate nodes to approaches used for digital elections. At the same time, for accountability, we need to investigate how to prevent illegitimate or improper information hiding. We need models that support accountable anonymity without depending on a trusted third party. As an example, accountability suffers when data provenance obfuscation or user anonymity hinders intruder identification.
4. New privacy-enabling and privacy-disabling technologies: The impact of emerging technologies on privacy should be investigated. In particular, broad research on privacy for pervasive computing is an urgent necessity. Unless proper access control is provided, data will become less and less restricted. Another important issue is privacy-preserving data mining on massive datasets.
5. Interdisciplinary privacy research: Interdisciplinary research should propose comprehensive and rich privacy models based on social and ethical privacy paradigms. Another direction is considering public acceptance of privacy requirements and rules, and their enforcement.

In turn, for trust-related solutions, the following research problems should be addressed
(ibid):
1. A better utilization of the social paradigm of trust: Utilization of the powerful social paradigm of trust, based on analogies to the uses of the notion of trust in social systems, should be explored in many ways. Finding out what makes trust work in existing social systems, and transferring this to the computing world, is a big challenge. This work calls for strong cooperation with social scientists.
2. Liability of trust: We need to provide methods, algorithms, and tools to identify which components and processes of a system depend on trust. We also need to find out to what extent and how the security of a system may be compromised if any of these trust-dependent components fails. As an example, the role of data provenance explanations in trust-based systems needs to be investigated.
3. Scalable and adaptable trust infrastructure: A high priority should be given to building scalable and adaptable trust infrastructures, including support for trust management and trust-based negotiations. In particular, support should be made available for gaining insight from different applications, for exploring the issue of dynamic trust, for building interoperable tools for the trust infrastructure, for developing flexible and extensible standards, and for investigating trust-based negotiations.
4. Benchmarks, testbeds, and development of trust-based applications: We need benchmarks and testbeds for experimenting with diverse roles of trust in computing systems. The experiments should form a strong basis for the development of diverse prototype trust-based applications. Trust-based solutions for new and emerging technologies should be studied. An example is ensuring data integrity and privacy in sensor networks deployed in trustless environments.
5. Interdisciplinary trust research: There is a strong need for trust-related interdisciplinary research outside of the realm of computer science and engineering. In addition to the already-mentioned interdisciplinary work on the social paradigm of trust, it should include research on ethical, social, and legal issues, both human-centered and system-centered. Another important direction is work on economic incentives for building trust, and disincentives and penalties for committing fraud.

Trust and privacy are strongly related to security. Therefore, in addition to the separate research
directions for privacy and trust specified, we can
also indicate threads of research common not only
to them, but also to security. This means research
on intersecting aspects of trust, privacy, and
security (TPS) (Bhargava et al., 2003). The first
common thread includes the tradeoffs, including
not only the tradeoff between privacy and trust,
but also performance vs. TPS, cost and functionality vs. TPS, and data monitoring and mining
vs. TPS. The second common thread contains
policies, regulations, and technologies for TPS.
This includes creation of flexible TPS policies,
appropriate TPS data management (including
collection, usage, dissemination, and sharing of
TPS data), and development of domain- and application-specific TPS approaches (such as TPS
solutions for commercial, government, medical,
and e-commerce fields). The third and fourth threads are the development of economic models for TPS and the investigation of legal and social
TPS aspects.

ACKNOWLEDGMENT
This research was supported in part by the NSF
Grants IIS-0242840, IIS-0209059, ANI-0219110,
and NCCR-0001788. The authors thank Dr. Yuhui
Zhong and Dr. Yi Lu for contributing input for
the subsections on Related Work, Trust Metrics,
Privacy Metrics, and Privacy-for-trust Optimization in Different-strength Trust Negotiations; as
well as Dr. Mohamed Hefeeda for contributing
Figure 2. Any opinions, findings, conclusions,
or recommendations expressed in the chapter are
those of the authors and do not necessarily reflect
the views of the funding agencies or institutions
with which the authors are affiliated.

REFERENCES
Aberer, K., & Despotovic, Z. (2001). Managing
trust in a peer-2-peer information system. In
Proceedings of the 2001 ACM CIKM International Conference on Information and Knowledge
Management, Atlanta, Georgia (pp. 310-317).
New York: ACM.
Agrawal, D., & Aggarwal, C. (2001). On the
design and quantification of privacy preserving data mining algorithms. In Proceedings of
the Twentieth ACM SIGMOD-SIGACT-SIGART
Symposium on Principles of Database Systems,
PODS01, Santa Barbara, California (pp. 247-255).
New York: ACM.
The American Heritage Dictionary of the English
Language (4th ed.). (2000). Boston: Houghton
Mifflin.
Barnes, G. R., Cerrito, P. B., & Levi, I. (1998). A
mathematical model for interpersonal relation-

ships in social networks. Social Networks, 20(2),


179-196.
Bhargava, B. (2006, September). Innovative ideas
in privacy research (Keynote talk). In Proceedings of the Seventeenth International Workshop
on Database and Expert Systems Applications
DEXA '06, Kraków, Poland (pp. 677-681). Los
Alamitos, CA: IEEE Computer Society.
Bhargava, B., Farkas, C., Lilien, L., & Makedon, F.
(2003, September 14-16). Trust, privacy, and security: summary of a workshop breakout session
at the national science foundation information
and data management (IDM) workshop held in
Seattle, Washington (Tech. Rep. No. 2003-34).
West Lafayette, IN: Purdue University, Center
for Education and Research in Information Assurance and Security (CERIAS).
Bhargava, B., Lilien, L., Rosenthal, A., &
Winslett, M. (2004). Pervasive trust. IEEE Intelligent Systems, 19(5), 74-77.
Bhargava, B., & Zhong, Y. (2002). Authorization
based on evidence and trust. In Y. Kambayashi, W.
Winiwarter, & M. Arikawa (Eds.), Proceedings of
4th International Conference on Data Warehousing and Knowledge Discovery (DaWaK 2002)
Lecture Notes in Computer Science (Vol. 2454,
pp. 94-103). Heidelberg, Germany: Springer.
Can, A. B. (2007). Trust and anonymity in peerto-peer systems. Ph.D. thesis, West Lafayette,
Indiana: Purdue University.
Carbo, J., Molina, J., & Davila, J. (2003). Trust
management through fuzzy reputation. International Journal of Cooperative Information
Systems, 12(1), 135-155.
Class FPR: Privacy. (1999). Common criteria
for information technology security evaluation.
Part 2: Security functional requirements. Version
2.1. (Report CCIMB-99-032) (pp.109-118). Ft.
Meade, MD: National Information Assurance
Partnership (NIAP). Retrieved June 5, 2007, from




http://www.niap-ccevs.org/cc-scheme/cc_docs/
cc_v21_part2.pdf
Collberg, C., & Thomborson, C. (2002). Watermarking, tamper-proofing, and obfuscation-tools
for software protection. IEEE Transactions on
Software Engineering, 28(8), 735-746.
Cofta, P. (2006). Impact of convergence on trust
in ecommerce. BT Technology Journal, 24(2),
214-218.
Cover, T., & Thomas, J. (1991). Elements of
information theory. Hoboken, NJ: John Wiley
& Sons.
Cranor, L. F. (2003). P3P: making privacy policies more useful. IEEE Security and Privacy,
1(6), 50-55.
Cranor, L. F., Reagle, J., & Ackerman, M. S.
(1999). Beyond concern: Understanding net users
attitudes about online privacy (Tech. Rep. No. TR
99.4.3). Middletown, NJ: AT&T Labs-Research.
Retrieved June 5, 2007, from http://citeseer.ist.
psu.edu/cranor99beyond.html
Diaz, C., Seys, S., Claessens, J., & Preneel, B.
(2003, April). Towards measuring anonymity. In R. Dingledine & P. F. Syverson, (Eds.),
Proceedings of the 2nd International Workshop
on Privacy Enhancing Technologies PET 2002,
San Francisco, CA. Lecture Notes in Computer
Science (Vol. 2482, pp. 184-188). Heidelberg,
Germany: Springer.
Donnellan, T. (1968). Lattice theory. Oxford, NY:
Pergamon Press.
Farrell, S., & Housley, R. (2002). RFC3281: An
Internet attribute certificate profile for authorization. The Internet Society. Network Working
Group. Retrieved June 5, 2007, from http://www.
ietf.org/rfc/rfc3281.txt
Fischer-Hübner, S. (2001). IT-security and privacy: Design and use of privacy-enhancing security
mechanisms. Lecture Notes on Computer Science,
1958. Heidelberg, Germany: Springer.



Fischer-Hübner, S. (2003). Privacy enhancing


technologies. Session 1 and 2, Ph.D. course,
Winter/Spring 2003. Karlstad, Sweden: Karlstad
University. Retrieved June 5, 2007, from http://
www.cs.kau.se/~simone/kau-phd-course.htm
Fujimura, K., & Nishihara, T. (2003). Reputation
rating system based on past behavior of evaluators. In Proceedings of the 4th ACM Conference
on Electronic Commerce, San Diego, CA, USA
(pp. 246-247). New York: ACM Press.
Garfinkel, T., Pfaff, B., Chow, J., Rosenblum, M.,
& Boneh, D. (2003). Terra: A virtual machinebased platform for trusted computing. In Proceedings of 19th ACM Symposium on Operating
Systems Principles, SOSP 2003, Bolton Landing,
New York (pp. 193-206). New York: ACM Press.
Retrieved June 5, 2007, from http://citeseer.ist.
psu.edu/667929.html
Goldberg, I. (2000). A pseudonymous communications infrastructure for the Internet. Ph.D. thesis,
University of California at Berkeley. Retrieved
June 5, 2007, from http://www.isaac.cs.berkeley.
edu/ iang/thesis-final.pdf
Green, A. M. (2003). International privacy laws.
Sensitive information in a wired world (Tech. Rep.
No. CS 457). New Haven, CT: Yale University.
IBM Privacy Research Institute. (2007). Armonk,
NY: IBM. Retrieved June 5, 2007, from http://
www.research.ibm.com/privacy/
Internet Security Glossary. (2007). The Internet
Society. Retrieved June 5, 2007, from www.faqs.
org/rfcs/rfc2828.html
Jensen, F.V. (1996). An introduction to Bayesian
networks. London: UCL Press.
Kelley, C. M., Denton, A., & Broadbent, R. (2001).
Privacy concerns cost ecommerce $15 billion.
Cambridge, MA: Forrester Research.
Lilien, L., & Bhargava, B. (2006). A scheme for
privacy-preserving data dissemination. IEEE


Transactions on Systems, Man and Cybernetics,


Part A: Systems and Humans, 36(3), 503-506.
Lilien, L., Gupta, A., & Yang, Z. (2007). Opportunistic networks for emergency applications
and their standard implementation framework. In
Proceedings of the First International Workshop
on Next Generation Networks for First Responders and Critical Infrastructure, NetCri07, New
Orleans, LA (pp. 588-593).
Lilien, L., Kamal, Z. H., Bhuse, V., & Gupta, A.
(2006). Opportunistic networks: The concept and
research challenges in privacy and security. In P.
Reiher, K. Makki, & S. Makki (Eds.), Proceedings of the International Workshop on Research
Challenges in Security and Privacy for Mobile
and Wireless Networks (WSPWN06), Miami, FL
(pp. 134-147).
Marchiori, M. (2002). The platform for privacy
preferences 1.0 (P3P1.0) specification. W3C
recommendation. W3C. Retrieved June 5, 2007,
from http://www.w3.org/TR/P3P/
Marsh, S. (1994). Formalizing trust as a computational concept. Ph.D. thesis, University of
Stirling, U.K.
McKnight, D. H., & Chervany, N. L. (2001). Conceptualizing trust: A typology and e-commerce
customer relationships model. In Proceedings of
the 34th Annual Hawaii International Conference
on System Sciences, HICSS-34, Island of Maui,
Hawaii (Volume 7, 10 pages). Washington, D.C.:
IEEE Computer Society. Retrieved June 5, 2007,
from http://csdl2.computer.org/comp/proceedings/hicss/2001/0981/07/09817022.pdf
McKnight, D., Choudhury, V., & Kacmar, C.
(2002). Developing and validating trust measures
for e-commerce: An integrative topology. Information Systems Research, 13(3), 334-359.
Mercuri, R. T. (2004). The HIPAA-potamus in
health care data security. Communications of the
ACM, 47(7), 25-28.

Morinaga, S., Yamanishi, K., Tateishi, K., &


Fukushima, T. (2002). Mining product reputations on the Web. In Proceedings of the 8th ACM
SIGKDD International Conference on Knowledge
Discovery and Data Mining (pp. 341-349). New
York: ACM Press. Retrieved June 5, 2007, from
citeseer.ist.psu.edu/morinaga02mining.html
Mui, L. (2002). Computational models of trust
and reputation: Agents, evolutionary games,
and social networks. Ph.D. thesis, Massachusetts
Institute of Technology.
Mui, L., Mohtashemi, M., & Halberstadt, A.
(2002). A computational model of trust and
reputation for e-businesses. In Proceedings of
the 35th Annual Hawaii International Conference
on System Sciences, HICSS02, Track 7, Island
of Hawaii, Hawaii (9 pages). Washington, D.C.:
IEEE Computer Society. Retrieved June 5, 2007,
from http://csdl2.computer.org/comp/proceedings/hicss/2002/1435/07/14350188.pdf
Pfitzmann, A., & Köhntopp, M. (2000). Anonymity, unobservability, and pseudonymity: A
proposal for terminology. In H. Federrath (Ed.),
Designing privacy enhancing technologies,
Proceedings of the Workshop on Design Issues
in Anonymity and Observability, Berkeley, California. Lecture Notes in Computer Science (Vol.
2009, pp. 1-9). Heidelberg, Germany: Springer.
Privacy Act. (2004). Washington, D.C.: U.S. Environmental Protection Agency. Retrieved June
5, 2007, from http://www.epa.gov/privacy/
Privacy Bird Tour. (2007). Retrieved June 5, 2007,
from http://www.privacybird.org/tour/1_3_beta/
tour.html
Privacy online: A report to congress. (1998).
Washington, D.C.: U.S. Federal Trade Commission.
Pujol, J. M., Sangüesa, R., & Delgado, J. (2002).
Extracting reputation in multi agent systems by
means of social network topology. In Proceed-




ings of the First International Joint Conference


on Autonomous Agents and Multiagent Systems,
AAMAS 02, Bologna, Italy (pp. 467-474). New
York: ACM Press. Retrieved June 5, 2007, from
citeseer.ist.psu.edu/pujol02extracting.html
Reiter, M., & Rubin, A. (1999). Crowds: Anonymity for Web transactions. Communications of the
ACM, 42(2), 32-48.
Ross, R. (2007, July 19). Robotic insect takes
off. Researchers have created a robotic fly for
covert surveillance. Technology Review, July
19. Retrieved July 20, 2007, from http://www.
technologyreview.com/Infotech/19068/
Sabater, J., & Sierra, C. (2002). Social ReGreT, a
reputation model based on social relations. ACM
SIGecom Exchanges, 3(1), 44-56.
Seigneur, J.-M., & Jensen, C. D. (2004). Trading
privacy for trust. In T. Dimitrakos (Ed.), Proceedings of the Second International Conference on
Trust Management, iTrust 2004, Oxford, UK.
Lecture Notes in Computer Science (Vol. 2995,
pp. 93-107). Heidelberg, Germany: Springer.
Sensor Nation. (2004). [Special Report]. IEEE
Spectrum, 41(7).
Serjantov, A., & Danezis, G. (2003). Towards an
information theoretic metric for anonymity. In R.
Dingledine & P. F. Syverson. (Eds.), Proceedings
of the 2nd International Workshop on Privacy Enhancing Technologies, PET 2002, San Francisco,
California. Lecture Notes in Computer Science
(Vol. 2482, pp. 259-263). Heidelberg, Germany:
Springer.
Summary of the HIPAA Privacy Rule. (2003).
Washington, D.C.: The U.S. Department of Health
and Human Services.
Sweeney, L. (1996). Replacing personally-identifying information in medical records, the Scrub
system. In J. J. Cimino (Ed.), Proceedings of the
American Medical Informatics Association (pp.
333-337). Washington, D.C.: Hanley & Belfus.


Retrieved June 5, 2007, from http://privacy.cs.cmu.


edu/people/sweeney/scrubAMIA1.pdf
Sweeney, L. (1998). Datafly: A system for providing anonymity in medical data. In T. Y. Lin &
S. Qian (Eds.), Database security XI: Status and
prospects. IFIP TC11 WG11.3 Eleventh International Conference on Database Security, Lake
Tahoe, California (pp. 356-381). Amsterdam, The
Netherlands: Elsevier Science.
Sweeney, L. (2001a). Computational disclosure
control: A primer on data privacy protection.
Ph.D. thesis, Massachusetts Institute of Technology.
Sweeney, L. (2001b). Information explosion. In L.
Zayatz, P. Doyle, J. Theeuwes, & J. Lane (Eds.),
Confidentiality, disclosure, and data access:
Theory and practical applications for statistical
agencies (26 pages). Washington, D.C.: Urban Institute. Retrieved June 5, 2007, from http://privacy.
cs.cmu.edu/people/sweeney/explosion2.pdf
Sweeney, L. (2002a). Achieving k-anonymity
privacy protection using generalization and suppression. International Journal on Uncertainty,
Fuzziness and Knowledge-based Systems, 10(5),
571-588.
Sweeney, L. (2002b). k-anonymity: a model for
protecting privacy. International Journal on
Uncertainty, Fuzziness and Knowledge-based
Systems, 10(5), 557-570.
Trustworthy Computing White Paper. (2003).
Redmond, Washington: Microsoft. Retrieved
June 5, 2007, from http://www.microsoft.com/
mscorp/twc/twc_whitepaper.mspx
Tschudin, C. (1999). Apoptosisthe programmed
death of distributed services. In J. Vitek & C.
D. Jensen (Eds.), Secure Internet programming.
Security issues for mobile and distributed objects.
Lecture Notes in Computer Science (Vol. 1603,
pp. 253-260). Heidelberg, Germany: Springer.


Tygar, J. D., & Yee, B. (1994). Dyad: A system


for using physically secure coprocessors. In Proceedings of the Joint Harvard-MIT Workshop
Technological Strategies for Protecting Intellectual Property in the Networked Multimedia
Environment. Annapolis, MD: Interactive Multimedia Association. Retrieved July 20, 2007, from
http://www.cni.org/docs/ima.ip-workshop/Tygar.
Yee.html
Wagealla, W., Carbone, M., English, C., Terzis,
S., Lowe, H. & Nixon, P. (2003, September). A
formal model for trust lifecycle management. In
Proceedings of the 1st International Workshop
on Formal Aspects of Security and Trust, FAST
2003, Pisa, Italy, (pp. 181-192). Retrieved July 20,
2007, from http://www.iit.cnr.it/FAST2003/fastproc-final.pdf
Welcome to the Safe Harbor. (2007). Trade Information Center. Retrieved June 5, 2007, from
http://www.export.gov/safeharbor/
Young, J. F. (1971). Information theory. New York:
Wiley Interscience.
Yu, B., & Singh, M. P. (2002a). Distributed
reputation management for electronic commerce.
Computational Intelligence, 18(4), 535-549.
Yu, B., & Singh, M. P. (2002b). An evidential
model of distributed reputation management.
In Proceedings of the First International Joint
Conference on Autonomous Agents and Multiagent Systems, AAMAS 02, Bologna, Italy, (pp.
294-301). New York: ACM Press.
Yu, T., Winslett, M., & Seamons, K. E. (2003).
Supporting structured credentials and sensitive
policies through interoperable strategies for automated trust negotiation. ACM Transactions on
Information and System Security, 6(1), 1-42.
Zacharia, G., & Maes, P. (2000). Trust management through reputation mechanisms. Applied
Artificial Intelligence, 14, 881-907.

Zhong, Y. (2005). Formalization of trust. Ph.D.


thesis, West Lafayette, IN: Purdue University.
Zhong, Y., & Bhargava, B. (2004, September).
Using entropy to trade privacy for trust. In Proceedings of the NSF/NSA/AFRL Conference on
Secure Knowledge Management, SKM 2004,
Amherst, NY (6 pages).
Zhong, Y., Lu, Y., & Bhargava, B. (2004). TERA:
An authorization framework based on uncertain
evidence and dynamic trust (Tech. Rep. No.
CSD-TR 04-009). West Lafayette, IN: Purdue
University.
Zhong, Y., Lu, Y., Bhargava, B., & Lilien, L.
(2006). A computational dynamic trust model
for user authorization (Working Paper). West
Lafayette, IN: Purdue University.

ADDITIONAL READING IN PRIVACY AND TRUST
Antoniou, G., Wilson, C., & Geneiatakis, D.
(2006, October). PPINAa forensic investigation protocol for privacy enhancing technologies.
In H. Leitold & E. Leitold (Eds.), Proceedings
of the 10th IFIP International Conference on
Communications and Multimedia Security, CMS
2006, Heraklion, Crete, Greece. Lecture Notes
in Computer Science (Vol. 4237, pp. 185-195).
Heidelberg, Germany: Springer
Bauer, K., McCoy, D., Grunwald, D., Kohno,
T., & Sicker D. (2007). Low-resource routing
attacks against anonymous systems (Tech. Rep.
No. CU-CS-1025-07). Boulder, CT: University of
Colorado at Boulder. Retrieved June 5, 2007, from
http://www.cs.colorado.edu/department/publications/reports/docs/CU-CS-1025-07.pdf
Bearly, T., & Kumar, V. (2004). Expanding trust
beyond reputation in peer-to-peer systems. In
Proceedings of the 15th International Database


and Expert Systems Applications, DEXA04,


Zaragoza, Spain (pp. 966-970).
Bouguettaya, A. R. A., & Eltoweissy, M. Y. (2003).
Privacy on the Web: facts, challenges, and solutions. IEEE Security and Privacy, 1(6), 40-49.
Cahill, V., Gray, E., Seigneur, J.-M., Jensen, C.D.,
Chen, Y., English, C., et al. (2003). Using trust for
secure collaboration in uncertain environments.
IEEE Pervasive Computing, 2(3), 52-61.
Capra, L. (2004, November). Engineering human
trust in mobile system collaborations. In Proceedings of the 12th International Symposium on the
Foundations of Software Engineering, SIGSOFT
2004, Newport Beach, California, (pp. 107-116).
Washington, D.C.: IEEE Computer Society.
Chaum, D. (1981). Untraceable electronic mail,
return addresses, and digital pseudonyms. Communications of the ACM, 24(2), 84-90. Retrieved
June 5, 2007, from http://world.std.com/~franl/
crypto/chaum-acm-1981.html
Frosch-Wilke, D. (2001, June). Are e-privacy and
e-commerce a contradiction in terms? An economic examination. In Proceedings of Informing
Science Challenges to Informing Clients: A Transdisciplinary Approach, Krakow, Poland, (pp. 191197). Retrieved June 5, 2007, from http://www.
informingscience.org/proceedings/IS2001Proceedings/pdf/FroschWilkeEBKAreEP.pdf
Grandison, T., & Sloman, M. (2000). A survey of
trust in Internet applications. IEEE Communications Surveys, 3(4), 2-16.
Hubaux, J.-P., Capkun, S., & Luo, J. (2004). The
security and privacy of smart vehicles. IEEE
Security and Privacy Magazine, 2(3), 49-55.
Kagal, L., Finin, T., & Joshi, A. (2001). Trust-based
security in pervasive computing environments.
IEEE Computer, 34(12), 154-157.
Kesdogan, D., Agrawal, D., & Penz, S. (2002).
Limits of anonymity in open environments. In


F. A. P. Petitcolas (Ed.), Proceedings of the 5th


International Workshop on Information Hiding
(IH 2002), Noordwijkerhout, The Netherlands.
Lecture Notes in Computer Science (Vol. 2578, pp.
53-69). Heidelberg: Germany: Springer. Retrieved
June 5, 2007, from http://www-i4.informatik.
rwth-aachen.de/sap/publications/15.pdf
Kpsell, S., Wendolsky, R., & Federrath, H.
(2006). Revocable anonymity. In Proceedings of
the 2006 International Conference on Emerging
Trends in Information and Communication Security, ETRICS, Freiburg, Germany. Lecture Notes
in Computer Science (Vol. 3995, pp. 206-220).
Heidelberg: Germany: Springer
Kuntze, N., & Schmidt, A. U . (2006). Transitive trust in mobile scenarios. In Proceedings of
the 2006 International Conference on Emerging Trends in Information and Communication
Security, ETRICS, Freiburg, Germany, Lecture
Notes in Computer Science (Vol. 3995, pp. 73-85).
Heidelberg: Germany: Springer.
Lam, S. K., Frankowski, D., & Riedl, J. (2006,
June). Do you trust your recommendations? An
exploration of security and privacy issues in recommender systems. In Proceedings of the 2006
International Conference on Emerging Trends
in Information and Communication Security,
ETRICS, Freiburg, Germany, Lecture Notes in
Computer Science (Vol. 3995, pp. 14-29). Heidelberg: Germany: Springer.
Langheinrich, M. (2001). Privacy by design
principles for privacy-aware ubiquitous systems.
In G. D. Abowd, B. Brumitt, & S. Shafer (Eds.),
Proceedings of the 3rd International Conference on Ubiquitous Computing, UbiComp 2001,
Atlanta, Georgia, Lecture Notes in Computer
Science (Vol. 2201, pp. 273-291). Heidelberg,
Germany: Springer.
Lu, Y., Wang, W., Bhargava, B., & Xu, D. (2006).
Trust-based privacy preservation for peer-to-peer
data sharing. IEEE Transactions on Systems, Man
and Cybernetics, Part A, 36(3), 498-502.

Privacy and Trust in Online Interactions

Manchala, D. W. (2000). E-commerce trust


metrics and models. IEEE Internet Computing,
4(2), 36-44.
Martin, D. M., Jr., Smith, R. M., Brittain, M.,
Fetch, I., & Wu, H. (2001). The privacy practices
of Web browser extensions. Communications of
the ACM, 44(2), 45-50.
Nielson, S. Crosby, S., & Wallach, D. (2005). A
taxonomy of rational attacks. In M. Castro & R.
Renesse (Eds.), Proceedings of the 4th International Workshop on Peer-to-Peer Systems, IPTPS
05, Ithaca, New York, Lecture Notes in Computer
Science, (Vol. 3640, pp. 36-46). Heidelberg, Germany: Springer.
Resnick, P., Kuwabara, K., Zeckhauser, R. &
Friedman, E. (2000). Reputation systems. Communications of the ACM, 43(12), 45-48.
Richardson, M., Agrawal, R., & Domingos, P.
(2003). Trust management for the Semantic Web.
In Proceedings of the 2nd International Semantic
Web Conference. Lecture Notes in Computer
Science, (Vol. 2870, pp. 351368). Heidelberg,
Germany: Springer.

Theodorakopoulos, G., & Baras, J. S. (2006). On


trust models and trust evaluation metrics for ad
hoc networks. IEEE Journal on Selected Areas
in Communications, 24(2), 318-328.
Thuraisingham, B. (2007, June). Confidentiality, privacy and trust policy enforcement for the
Semantic Web. In Proceedings of the Eighth
IEEE International Workshop on Policies for
Distributed Systems and Networks, POLICY07,
Bologna, Italy, (pp. 8-11). Washington, DC: IEEE
Computer Society.
Virendra M., Jadliwala, M., Chandrasekaran, M.,
& Upadhyaya, S. (2005, April). Quantifying trust
in mobile ad-hoc networks. In Proceedings of the
IEEE International Conference on Integration
of Knowledge Intensive Multi-Agent Systems,
KIMAS 2005, Boston, MA, (pp. 65-70). Washington, D.C.: IEEE Computer Society.
Westin, A. (1967). Privacy and freedom. New
York: Atheneum.
Westin, A. (2003). Social and political dimensions of privacy. Journal of Social Issues, 59(2),
431-453.

Rousseau, D. M., Sitkin, S. B., Burt, R. S., &


Camerer, C. (1998). Not so different after all: A
cross-discipline view of trust. Academic Management Review, 23(3), 393-404.

Xiao, L., Xu, Z., & Zhang, X. (2003). Low-cost and


reliable mutual anonymity protocols in peer-topeer networks. IEEE Transactions on Distributed
Systems, 14(9), 829-840.

Sandberg, C. K. (2002). Privacy and customer


relationship management: can they peacefully
co-exist. William Mitchell Law Review, 28(3),
1147-1162.

Yao, D., Frikken, K., Atallah, M., & Tamassia,


R. (2006, December). Point-based trust: Define
how much privacy is worth. In Q. Ning & Li
(Eds.), Proceedings of the Eighth International
Conference on Information and Communications
Security, ICICS 06, Raleigh, North Carolina,
Lecture Notes in Computer Science, (Vol. 4307,
pp. 190-209). New York: ACM Press.

Skogsrud, H., Benatallah, B., & Casati, F. (2003).


Model-driven trust negotiation for Web services.
IEEE Internet Computing, 7(6), 45-52.
Squicciarini, A. C., Bertino, E., Ferrari, E., & Ray,
I. (2006). Achieving privacy in trust negotiations
with an ontology-based approach. IEEE Transactions on Dependable and Secure Computing,
3(1), 13-30.

Zhang, N., & Todd, C. (2006, October). A privacy


agent in context-aware ubiquitous computing
environments. In H. Leitold & E. Leitold (Eds.),
Proceedings of the 10th IFIP International Conference on Communications and Multimedia



Privacy and Trust in Online Interactions

Security, CMS 2006, Heraklion, Crete, Greece,


Lecture Notes in Computer Science, (Vol. 4327,
pp. 196-205). Heidelberg, Germany: Springer.
Zhu, H., Bao, F., & Liu, J. (2006). Computing of
trust in ad-hoc networks. In H. Leitold & E. Leitold
(Eds.), Proceedings of the 10th IFIP International
Conference on Communications and Multimedia
Security, CMS 2006, Heraklion, Crete, Greece,
Lecture Notes in Computer Science, (Vol. 4237,
pp. 1-11). Heidelberg, Germany: Springer.

Endnotes
1. A successful construction of a cyberfly, the first robot modeled on a fly and built on such a small scale to achieve liftoff, was just reported (Ross, 2007).
2. This includes politics. A Russian proverb, "Trust but verify," was made famous in the mid-1980s by President Reagan at the start of the historic negotiations with General Secretary Gorbachev.
3. A special case of protection of this kind is preventing any forwarding of disseminated data by any party that received it directly from their original source.



Chapter VI
Current Measures to Protect E-Consumers' Privacy in Australia

Huong Ha
Monash University, Australia
Ken Coghill
Monash University, Australia
Elizabeth Ann Maharaj
Monash University, Australia
Abstract
The current measures to protect e-consumers' privacy in Australia include (i) regulation/legislation, (ii)
guidelines, (iii) codes of practice, and (iv) activities of consumer associations and the private sector.
However, information about the outcomes of such measures has not been sufficiently reported, whereas
privacy incidents have increased. Some policy implications for e-consumer protection are drawn from
the analysis. Firstly, national privacy legislation should widen its coverage. Secondly, uniform regulations and guidelines could contribute to providing equal protection to e-consumers. Thirdly, guidelines
and codes of practice need to be supported by legislation and a proper compliance regime. Corporate
social responsibility by e-retailers is also required for effective adoption of self-regulatory measures.
Fourthly, consumer education is important to enhance consumer awareness of online privacy risks and
their ability to deal with such incidents. Finally, a combination of legal frameworks, technological, and
human-behaviour related measures is more likely to address online privacy issues effectively.



Introduction
E-retailing has generated many benefits to both
e-retailers and e-consumers. At the same time,
it has also raised serious problems for the operation of the online market, especially consumer
protection. Among several problems with online
shopping, privacy concerns are key factors which
discourage consumers from shopping online
(Stoney & Stoney, 2003).
These concerns have been addressed by a
number of measures at both the international and
national levels. However, insufficient information
about these measures and the outcomes of such
measures has been reported.
This chapter examines the current measures to protect consumers' privacy in the online market, using Australia as a case study; examines the current state of e-consumer protection regarding privacy; and discusses policy implications for the protection of e-consumers' privacy.
This chapter consists of four main sections.
The first section introduces three main privacy
issues, namely data security, spam/spim, and
spyware. The second section examines several
measures implemented at the international and
national levels to address privacy issues. In Australia, these measures include (i) legislation, (ii)
guidelines, (iii) codes of practice, (iv) initiatives by
the private sector, and (v) activities by consumer
associations. The effectiveness of the current measures to address privacy concerns is examined in the third section by analysing the current state of e-consumer protection in terms of
privacy. This section also discusses a case study,
using Dell as a subject of investigation. The final
section discusses the policy implications.
The findings suggest that although legislation,
guidelines, and codes of practice are available,
the effectiveness of these measures is limited.
Consumers are not confident about shopping online due to privacy and security concerns. Also, the protection of consumers' personal information depends on how e-retailers exercise their corporate social responsibility to provide protection to e-consumers.
The chapter aims to contribute to the development of theoretical understanding relating to
regulations, guidelines, industry codes of conduct,
and initiatives by the private sector to protect e-consumers' privacy. It also provides an insight
into measures addressing privacy concerns and
how these measures could be improved to enhance
consumer confidence in the online market.

Background
This section first discusses three sub-issues of
concern in the protection of e-consumers' privacy.
It then introduces the concept of consumer rights,
and discusses justification for e-consumer protection. It also analyses the current framework for
e-consumer protection regarding privacy.

Privacy Issues
Privacy is one of the key issues in e-consumer
protection (Stoney & Stoney, 2003; Consumers
International, 2001; Jackson, 2003; Kehoe, Pitkow,
Sutton, Aggarwal, & Rogers, 1999). Internet users
are very concerned about how their personal and
financial data and medical history are collected,
used and disseminated (Consumers International,
2001; Jackson, 2003; Kehoe, Pitkow, Sutton, Aggarwal, & Rogers, 1999; Moghe, 2003). Many
consumers are very reluctant to reveal their
particulars because they do not want e-retailers
to misuse their personal information. However,
by adopting advanced technology, e-retailers can
easily collect personal and financial details of
e-consumers (Lynch, 1997). In addition, many
Web sites request e-shoppers to register or accept cookies which can help in tracking their
Internet itinerary (Yianakos, 2002). Privacy risks
can become a greater danger when e-retailers
share common databases (Egger, 2002). To make
things worse, many e-retailers have not published


privacy policies on their Web sites (Consumer Affairs Victoria, 2003; Consumer Affairs Victoria,
2004). For example, only 28% of the Web sites investigated in the Internet Sweep Day 2001 had
privacy policies (Australian Competition and
Consumer Commission, 2003).
This chapter focuses on three sub-issues,
namely data security, spam/spim, and spyware, affecting privacy due to their relevance
to e-consumer protection. Firstly, data security
refers to the security of personal and financial
information during the collection, usage, transmission, and retention stages of e-transactions.
Personal identity includes name, residential and postal address, driving license, date of birth, social security number, health card number, passport number, birth certificate, contact number, contact and place of employment, mother's maiden name, employee or student identification number, and e-mail address of the workplace (Lawson & Lawford, 2003; Milne, 2003). Identity can also be a username, password, cryptographic keys, physical devices such as dongles, swipe cards, or even biometric recognition (Marshall & Tompsett, 2005). Financial information includes bank account number, credit card number, password, personal identification number (PIN), and tax file number (Lawson & Lawford, 2003; Milne, 2003). Identity theft occurs when a customer's financial and personal information is illegally collected and used by unscrupulous retailers or unauthorised persons in order to impersonate another for personal gain or to commit fraud
(Grabosky, Smith, & Dempsey, 2001; Lawson
& Lawford, 2003; Marshall & Tompsett, 2005;
Milne, Rohm, & Bahl, 2004; Smith, 2004). It has
been noted that more cases of identity theft have
been committed via electronic means (Grabosky,
Smith, & Dempsey, 2001; Ha, 2005; Metz, 2005).
Information can be unlawfully obtained during
the transmission process. Employees, independent
hackers, criminal individuals and organised crime rings, business competitors, saboteurs,
and cyber terrorists are possible intruders (Crime

and Misconduct Commission Queensland, 2004;


Ha, 2005; Ha, 2006). Information can also be
attained from a dishonest e-retailer or an online
bank officer who illegally shares customers' information with others (Egger, 2002). For instance,
the Financial Institution Online Fraud Survey in
2004 reported that 40% of bank officers shared
passwords (Cyota, 2005). Information can also
be illegally acquired during the retention stage,
especially when the storage of data is connected
to the Internet (Milloy, Fink, & Morris, 2002).
Other ways to unlawfully obtain information
online are the use of computer viruses/worms,
spyware, phreaking, smurfing, and account harvesting. Phreaking refers to the hacking of a
telecommunication system to obtain free phone
services (U.S. Senate Permanent Subcommittee
on Investigations, 1986). Hacking refers to the
illegal breaking into a computer system for various malicious purposes (Krone, 2005). Smurfing refers to the use of a smurf program that exploits the Internet Protocol (IP) and the Internet Control Message Protocol (ICMP) to send a Packet Internet Groper (ping) request to an Internet host to test its response (Krone, 2005).
Account harvesting refers to the collection of
e-mail accounts from information in the public
domain or by using spybots to search for e-mail
addresses stored locally on a computer (Krone,
2005). However, this chapter does not examine
these heavily technical forms of computer offences
(Ha, 2007, unpublished).
Secondly, spam occurs as a result of personal
e-mail addresses being divulged, and is thus a
subset of privacy concerns. In this case, users' e-mail addresses are either stolen randomly from the Internet or are provided to other parties by e-retailers or authorised persons in their companies, and then used for unsolicited commercial e-mails (UCE) or unsolicited bulk e-mails (UBE) (Quo, 2004).
Spam usually promotes prohibited or undesirable content which involves scams, hoaxes,
deception and advertisements of financial products, sexual sites, and unaccredited educational




programs (Cheng, 2004; James & Murray, 2003;


OECD, 2006). Spam takes up a lot of computer resources (e.g., memory), and thus reduces productivity and annoys Internet users with unsolicited e-mails and pop-up advertisements. Spam spreads computer viruses and worms and wastes users' time in filtering and deleting junk messages (Australian Direct Marketing Association (ADMA), 2005; Cheng, 2004), causing huge financial and productivity losses to industry (Ha, 2007; MacRae,
2003). A new version of spam, called spim, targets
instant messaging (IM) services and spreads
undesirable content such as pornography (Australian Institute of Criminology, 2006).
Finally, spyware refers to software that is
used to steal information related to identification,
password, and PIN numbers during a transaction
process (Australian Federal Police (AFP), 2007).
It can monitor and gather sensitive financial or
medical information from Internet users. It can
also change users' browser settings or home pages
which may direct users to other Web sites (Majoras, Swindle, Leary, Harbour, & Leibowitz, 2005).
However, many consumers do not have adequate
knowledge about it (Zhang, 2005). Only 45.6% of
respondents in a survey by Zhang (2005) in 2005
were aware that their PCs could be infected by
spyware. Therefore, there is an urgent need to
raise e-consumers' awareness of spyware, and
how to deal with the online privacy risks and for
societies and groups to establish and define new
norms of behaviour (Kaufman, Edlund, Ford,
& Powers, 2005).
Overall, e-consumers are not confident of
e-retailers' practices regarding privacy protection. A survey by Roy Morgan Research (2004)
reported that 81% of the respondents believed
that customer details of a business are often
transferred or sold to another business. Consumers only want to shop online and provide their
personal information to e-retailers when they
trust such e-retailers (Grabner-Kraeuter, 2002;
Kim, Williams, & Lee, 2003; Mansoorian, 2006;
Saarenp & Tiainen, 2003; Walczuch, Seelen,



& Lundgren, 2001). Hence, e-retailers need to


keep customers' information confidential and they are, by law, not allowed to use customers' personal information for other promotion activities without customers' prior consent (Ha, 2007,
unpublished). The analysis shows that privacy is a
problem of both technology and human behaviour
(Yianakos, 2002). Thus, the priority task is to find
the best set of motivators and inducements to
deal with simple challenges which are made
more complicated by the networked world
(Yianakos, 2002).

Consumer Rights and Consumer Protection
The four basic consumer rights, introduced in
1962 by the then USA President John Kennedy,
aim to empower consumers in commercial,
social, and economic activities (Akenji, 2004;
Consumers International, 2006). They are the
rights to (i) safety, (ii) information, (iii) choice,
and (iv) representation (Akenji, 2004; Consumers
International, 2006).
These rights were adopted by international
consumer organisations and another four rights
were added as a result of the development of
consumer movement led by Consumers International (USA) (Consumers International, 2006).
The United Nations adopted the eight basic
consumer rights in 1985, including the right to:
(i) safety, (ii) be informed, (iii) choose, (iv) be
heard, (v) satisfaction of basic needs, (vi) redress,
(vii) consumer education, and (viii) a healthy
environment (Akenji, 2004; Choice, 2006; Consumers International, 2006; Federal Bureau of
Consumer Affairs (Australia), 1993; Huffmann,
2004; Mansor, 2003; NSW Office of Fair Trading,
2003; Singh, 2002).
These rights have been the basic foundation
for countries to establish their own principles for
consumer protection. However, these rights apply to consumers in both the online and off-line
market without distinction for e-consumers. Ac-


cording to these rights, e-consumers are entitled to be further protected against "unsolicited communication ... invasion of privacy" (Huffmann, 2004).
Therefore, consumer protection aims to
protect the interest(s) of consumers in trade and
commerce (Quirk & Forder, 2003). Consumer
protection is one of the tools to implement the
eight universal basic consumer rights. It refers to
activities which police the market against acts
and practices that distort the manner in which
consumers make decisions in the marketplace
(Muris, 2002).
The development of the online global marketplace has entailed a number of consumer
protection issues in Australia which have been
uniquely e-related, that is, some issues only
occur in the online market (Round & Tustin,
2004). Consumer protection is important in eretailing for a number of reasons. Firstly, there
has been a lack of respect for [consumer] rights
and their safety online and many e-consumers
do not know what kind of protection they have
been entitled to receive, how and by whom they
have been protected (Scottish Consumer Council,
2001). It means the first consumer right to safety
may not be met. Secondly, e-consumers seem to
be disadvantaged in commercial transactions in
terms of lack of choice in contracts and lack
of understanding of terms and conditions, and
lack of knowledge about alternative choice in an
e-contract (Petty & Hamilton, 2004). The second
and third consumer rights to receive information
and to choose may be violated if these above issues
are not properly addressed. Consumers have to
accept all terms and conditions, known as "bundle consent", which usually includes the option for
receiving further marketing information, if they
want to proceed with an online purchase (Ha,
2007, unpublished). Thirdly, consumers have to
take more risks when conducting online transactions because customers cannot contact e-retailers

physically before making decisions to buy online.


They also have to pay in advance and provide their
personal and financial information (Huffmann,
2004). Nevertheless, they cannot be assured that
their privacy is sufficiently protected. Consumer
rights to receive adequate protection in terms of
privacy are not respected in practice.

The Current Measures to Protect E-Consumers' Privacy
Currently, privacy issues associated with consumer protection in e-retailing have been addressed
by a mixture of measures at the international
and national levels. This section briefly discusses
legislation and guidelines introduced by national
and supra-national organisations to protect consumers' privacy. It also analyses other measures
which are employed by different organisations
in both the public and private sectors to address
privacy issues in e-retailing.

Table 1. Summary of international and national guidelines and legislation for consumer protection in terms of privacy (based on information in Harland, 1987; North American Consumer Project on Electronic Commerce, 2006)

Organisation | Guidelines and Legislation
UN | Universal Declaration of Human Rights 1948; Guidelines for Consumer Protection 1985 (expanded in 1999)
OECD | Guidelines on the Protection of Privacy and Trans-border Flows of Personal Data 1980; Guidelines for Consumer Protection in the Context of Electronic Commerce 1999
EU | Data Protection Directive (EU) (95/46/EC)
APEC | APEC Voluntary Online Consumer Protection Guidelines 2002

Sources: Summarised from North American Consumer Project on Electronic Commerce (NACPEC). (2006). Internet Consumer Protection Policy Issues. Geneva: The Internet Governance Forum (IGF); and Harland, D. (1987). The United Nations Guidelines for Consumer Protection. Journal of Consumer Policy, 10(2), 245-266.




International Level
The following section discusses legislation and
guidelines relating to consumer protection by the
UN, the OECD, the EU, and APEC (Table 1).

The United Nations (UN)


According to Article 12 in the Universal Declaration of Human Rights proclaimed by the General
Assembly of the United Nations in 1948, everyone
has the right to be protected against interference
with his privacy (Australian Privacy Foundation
Inc, 2006; United Nations, 1948). This document
acknowledges the universal right of human beings
regarding privacy.
The UN Guidelines for Consumer Protection
1985, adopted on April 9, 1985 (Harland, 1987),
are an extension of the UN basic consumer rights
(see Appendix 1). These provide guidance for
governments of member countries to develop,
strengthen, and modify (if necessary) the legal
framework and policies related to consumer
protection in their countries. However, provision
for privacy protection is not mentioned in this
set of guidelines.

The Organisation for Economic Co-Operation and Development (OECD)
The OECD introduced the Guidelines Governing the Protection of Privacy and Trans-border
Flows of Personal Data in September 1980. This
set of guidelines provides a foundation for member countries to review their privacy legislation.
The OECD guidelines include five parts and 22
provisions which provide directions to stakeholders regarding the collection, use, transmission,
and retention of individual information (OECD,
2001b). The OECD guidelines also seek strong
support among all stakeholders within and among
countries to adopt these guidelines, and promote


international co-operation. This aims to develop


a healthy online market which can facilitate the
production and distribution of goods and services globally, whereas individuals' privacy can
still be protected. OECD member countries are
encouraged to employ a wide range of measures
to deal with privacy incidents such as using (i)
market-based incentives and punishment (e.g.,
trust-marks and privacy seals) to encourage compliance with standards, (ii) technical measures,
(iii) self-regulatory approach (e.g., online privacy
policies), and (iv) online privacy-related dispute
resolution (OECD, 2003).
The OECD Guidelines for Consumer Protection in the Context of Electronic Commerce
were approved by member countries in 1999. This
document covers many areas such as information
disclosure, confirmation process, and conclusion
of contracts, security, privacy, dispute resolution,
and redress as shown in Appendix 1 (OECD,
2000; Smith, 2004). The seventh of the OECD guidelines for e-consumer protection aims to protect e-consumers' privacy in accordance with the OECD
Guidelines Governing the Protection of Privacy
and Transborder Flow of Personal Data (1980).
This document also advises member countries
to improve measures to protect e-consumers
regarding privacy and other aspects, taking into
consideration the cross-border flow of information (OECD, 2000).
Similarly to the UN guidelines, both of these
OECD guidelines encourage private sector initiatives and call for strong and fruitful co-operation
among all stakeholders to achieve the common
objectives (OECD, 2000). These documents
emphasise consumer protection in general, and
privacy protection is only a sub-set of consumer
protection.

The European Union (EU)


The key legislation regarding e-consumer privacy protection in the European Union (EU) is the Data Protection Directive (Huffmann,


2004). This directive and self-regulatory privacy protection schemes aim to deal with spam
and other privacy incidents (Cheng, 2004). The
EU also acknowledged that sufficient consumer
protection in terms of privacy constitutes a fundamental right to consumers, and new areas of
protection had to be addressed so that the full
potential of e-retailing could be realised, and
both e-consumers and e-retailers could take full
advantage of the benefits which e-retailers could
offer (Buning, Hondius, Prins, & Vries, 2001).
However, Appendix 1 shows that EU Principles
of Consumer Protection do not cover the protection of privacy.

Asia Pacific Economic Co-Operation (APEC)
APEC introduced the APEC Voluntary Online Consumer Protection Guidelines in 2002 (APEC Electronic Commerce Steering Group, 2004). These guidelines state that consumers must receive the same level of protection no matter which form of commerce they engage in. The
APEC guidelines include ten provisions relating
to international cooperation, education and awareness, private sector leadership, online advertising
and marketing, online information disclosure to
consumers, confirmation process, resolution of
consumer disputes, privacy, security, and choice
of law and jurisdiction as summarised in Appendix 1 (APEC Electronic Commerce Steering
Group, 2004; Consumer Protection Commission,
n.d.). The content of these guidelines is similar to
the UN and the OECD guidelines, and generally
consistent with the eight basic consumer rights.
In brief, a number of guidelines and legislation have been introduced by the UN, the OECD,
the EU, and APEC to create a favourable online
environment, and to protect consumers in general, and their privacy in particular. Yet, these
guidelines have not been compulsory, and there
have been no clear powers or mechanisms to
enforce them. These guidelines aim to comple-

ment existing national legal frameworks for


e-consumer protection rather than overriding or
replacing them (Ha, 2006). Although the level
of compliance may vary among individuals and
organisations, depending on what roles they play,
the ultimate objective of these documents is to
create a safe and secure online environment for
all players. At the international level, there has
been no measure to deal with member countries
which have not complied with these guidelines
for whatever reasons. Non-uniform regulations
and standards in different countries have made the
protection of e-consumers' privacy more difficult
and challenging due to ambiguous jurisdictional
applications and difficulties in enforcement. Thus,
different countries can adopt these guidelines to
review and develop their current privacy policy
on e-consumer protection (Harland, 1999).

Australia
In Australia, privacy issues had already been a
concern of the public and relevant authorities even
before the introduction of e-retailing. Privacy
concerns have been tackled by several measures,
including (i) legislation, (ii) guidelines, (iii) codes
of practice, (iv) initiatives by the private sector,
and (v) activities by consumer associations as
summarised in Table 2.
This section discusses the current policy framework for protecting e-consumers' privacy and the institutional arrangements in Australia.

Legislation
Privacy protection is based on two main mechanisms: (i) general laws that regulate the collection, use, and dissemination of personal data both
by the public and private sector and (ii) different
acts (Moulinos, Iliadis, & Tsoumas, 2004).



Table 2. The current regulatory framework to address privacy concerns in Australia (based on information in Ha, 2007, unpublished thesis)

Policy framework | Activities
Legislation | Privacy Act 1988 (Public Sector) (Cth) (Commonwealth); Privacy Amendment (Private Sector) Act 2000 (Cth) (Commonwealth); Telecommunications (Interception and Access) Act 1979 (Cth) (Commonwealth); Spam Act 2003 (Cth) (Commonwealth)
Guidelines | Scams and Spam booklet and fact-sheets published by the ACCC and the CAV, respectively
Industry codes of practice | Codes by the Australian Direct Marketing Association (ADMA) and the Australian Internet Industry Association (IIA)
Initiatives by the private sector | Using privacy/digital seals and trust marks provided by TRUSTe, WebTrust, BBBOnline, BetterWeb; using the Platform for Privacy Preferences
Activities by consumer associations | Activities by the Australian Consumers' Association (ACA) and the Australian Privacy Foundation (APF)

Source: Ha, H. (2007). Governance to Address Consumer Protection in E-retailing (unpublished thesis). Department of Management, Monash University.

Table 2 shows that there are three main acts


regarding privacy: Privacy Act, Telecommunications Act, and Spam Act. Firstly, the Privacy
Act 1988 (Cth) is applicable to the public sector
regarding handling personal information (Curtis,
2005b; Jackson, 2003). This Act requires government agencies to gather personal information in a
legitimate and proper way. These agencies must
transparently disclose to the recipients
what will be collected and how the information
will be used (Jackson, 2003; Vasiu, Warren, &
Mackay, 2002). Government bodies must ensure
that personal information is recorded accurately
and stored securely. They have to explain to individuals the nature of the information
and why the information has been collected as
well as allow individuals to make changes if the
information is not correct (Jackson, 2003).
However, this Act has been criticised by the EU
on four main grounds. Firstly, it excludes many
organisations such as the private sector from the
coverage of the Act. Secondly, it introduces an
opt out scheme which permits organisations
to use personal data for direct marketing without obtaining the prior consent of the recipient.
Thirdly, it only covers data collected and used for
other purposes rather than the primary purpose.
An organisation that collects and uses personal data for the primary purpose for which it was collected is not regulated under this Act (Vasiu,


Warren, & Mackay, 2002). Finally, the Act only
protects the privacy of information collected from
Australian citizens or permanent residents, not
from foreigners even if they are residing in Australia at the time of the offence (Vasiu, Warren,
& Mackay, 2002).
The Privacy Act 1988 (Cth), amended in 2000,
became the Privacy Amendment (Private Sector) Act 2000 (Cth) (Expert Group on Electronic
Commerce (Australia), 2003). This amended Act
extended to the private sector and privatised government corporations (Jackson, 2003). Schedule
3 in the Privacy Amendment (Private Sector) Act
2000 (Cth) includes the ten National Principles
for the Fair Handling of Personal Information
(NPPs) which do not apply to sub-national public
sector or government business enterprises (GBEs)
that perform substantially core government
functions (Information Law Branch, n.d.). The
10 national privacy principles are: (i) collection,
(ii) use and disclosure, (iii) data quality, (iv) data
security, (v) openness, (vi) access and correction,
(vii) identifiers, (viii) anonymity, (ix) transborder
data flows, and (x) sensitive information (Privacy
Commissioner (Australia), 2000). At the minimum
level, businesses must adhere to these 10 NPPs
(Jackson, 2003; Moghe, 2003; Privacy Commissioner (Australia), 2000). Paradoxically, the rights


of individuals to privacy are said to obstruct the


investigation of activities of cyber criminals such
as hackers, money launderers, drug traffickers,
terrorists, and fraudsters, and thus also hinder law
enforcement (Armstrong & Forde, 2003).
Nevertheless, the amended Act is not applicable
to small businesses with annual revenue less than
$3 million, the media industry or political parties and representatives (Jackson, 2003). These
organisations can choose to adopt a privacy code
established by an industry which must be approved
by the privacy commissioner (Moghe, 2003).
Small firms are not covered by the Act, and thus
many e-retailers may avoid liability for the misuse
of personal information or the abuse of customers'
particulars. Neither do these national privacy acts
regulate sub-national agencies, with an exception
for the Australian Capital Territory (Clark, 1989;
Privacy Commissioner (Australia), 2003; Privacy
Commissioner (Australia), 2003, n.d.).
The current legislation may not be effective in
tackling privacy concerns due to legal loopholes.
The Privacy Act 1988 (Cth) and the Privacy
Amendment Act 2000 (Cth) are not applicable to
overseas spammers and international e-retailers
that do not buy and sell personal information. An
e-mail address without an individual's name may
not be regarded as personal information under the
Privacy Act. If a spammer gives the receivers a
chance to opt out or complies with such requests,
he/she is not under any legal liability (National
Office for the Information Economy (Australia),
2003). Furthermore, it is not easy to track the
senders due to the anonymous and global nature of
electronic messages. Although national legislation
is in place, these acts do not cover all individuals
and groups. This allows some e-retailers to avoid protecting their customers' privacy, given the loopholes in the current privacy legislation.
Regarding institutional arrangements, the Federal Office of the Privacy Commissioner administers
the Privacy Act 1988 (Cth). Each sub-national
jurisdiction also has an office of the Privacy
Commissioner. For example, the office of the

Victorian privacy commissioner (Privacy Victoria


or OVPC), established in 2001 by the Information
Privacy Act 2000 (Vic), performs various functions including advising the attorney-general and
relevant agencies on the privacy implications of
proposals for policy and legislation, revising the
Guidelines to the information privacy principles
(IPPs), investigating and enforcing breaches of
the IPPs (Privacy Commissioner (Australia),
2006a).
The second relevant Australian Act is the
Telecommunications (Interception and Access)
Act 1979 (Cth), amended in 2006, which specifies
the responsibilities of Internet service providers
in terms of the usage and disclosure of customers' information. This Act encourages relevant
industries to develop codes of practice and other
internal schemes in order to address consumer
protection regarding privacy (Moghe, 2003).
Another act is the Australian Spam Act 2003
(Cth) which prohibits the sending of unsolicited
commercial electronic messages which have an
Australian link and the use of address-harvesting
software (Quo, 2004). It is unlawful to send any
spam, either by mobile phones or by e-mail, from
Australia and/or to an Australian address from
overseas (Williams, 2004). This act extends to
every external territory and also applies to matters outside Australia if there is no contrary intention (see Spam Act 2003 (Cth) ss). Yet, messages
from government agencies, registered political
parties, charities, and religious organisations are
exempted from this act. There are two conditions
of the exemption: (i) messages can be sent en masse but these messages must communicate information about goods or services, and (ii) the senders must be the providers of such products
(Ha, 2007, unpublished). Messages with merely
factual content and with no commercial content
are also exempted, but these messages must contain precise identifying information (Australian
Communications and Media Authority, 2005).
Also, messages sent by e-retailers to consumers
with whom they have existing relationships will not




be classified as spam (Cheng, 2004). This Act


introduces an opt in principle which allows the
consent to be implicit or explicit (Vaile, 2004).
This Act does not cover non-electronic messages
and, therefore, it has a narrower application
than legislation by the UK (Cheng, 2004). On the
other hand, the Spam Act 2003 (Cth) has a wider application than the EU Privacy Directive and the
US Can-Spam Act 2003 (US) because this Act
is applicable to spam originating from outside
Australia (Cheng, 2004).

Guidelines
The Treasury (Australia) published the Australian
Guidelines for Electronic Commerce (AGEC) in
March 2006, which has become one of the most
important documents promoting best practice
in e-retailing (The Australian Guidelines for
Electronic Commerce (AGEC) 2006 replaces the
Australian Best Practice Model (BPM) introduced
in 2000). The AGEC consists of 14 main provisions (see Table 3).

Table 3. The Australian guidelines for electronic commerce 2006 (AGEC) (based on information in Treasury (Australia), 2006)

No. | Guidelines
1 | Fair business practices
2 | Accessibility
3 | Disability access
4 | Advertising and marketing
5 | Engaging with minors
6 | Information: identification of the business
7 | Information: contractual
8 | Conclusion of contract
9 | Privacy
10 | Payment
11 | Security and authentication
12 | Internal complaint-handling
13 | External dispute resolution
14 | Applicable law and forum

Source: Summarised from Treasury (Australia). (2006). The Australian Guidelines for Electronic Commerce (March 2006). Canberra: Treasury (Australia).



Guideline 9, Items 37 and 38 of the AGEC indicate that consumers' privacy must be respected
and protected. The AGEC encourages small
businesses, which are not under the scope of the
Privacy Act 1988 (Cth), to comply with privacy
legislation so that they can enhance consumer
trust and confidence (Treasury (Australia), 2006).
Nevertheless, some e-retailers do not want to adopt
government guidelines (Ha, 2007).
In 2004, the Australian Competition and Consumer Commission published a Scams and Spam
booklet and other educational material to inform
consumers about the types and the adverse effect
of scams and spam, and how to avoid scams and
spam (Australian Competition and Consumer
Commission, 2004). National enforcement of
consumer protection laws is undertaken by the
ACCC. Established in 1995, it acts independently
of ministerial direction as a national statutory
body to administer the implementation of the Trade Practices Act (TPA). The main function of the ACCC is to advance the interests of Australian consumers by promoting fair, dynamic, and lawful competition among all kinds of businesses. The ACCC has continuously advocated consumer rights and has conciliated many complaints related to online purchases (Graeme, 2005; OECD, 2001a).
Other countries, such as the USA, have enacted
a number of pieces of legislation to deal with
privacy issues in general. For example, the U.S.
Can-Spam Act deals with offences relating to
spam e-mails (Cheng, 2004; OECD, 2003). The
state of California (U.S.) has enacted the Anti-Phishing Act of 2005 (Business and Professions Code sections 22948-22948.3) and the Computer Spyware legislation of 2004 (Business and Professions Code section 22947), which prohibit
phishing activities and illegal installation or
provision of software that can collect personal
information of the recipients without their knowledge and/or consent (State of California, 2007).
The Online Privacy Protection Act of 2003
(Business and Professions Code sections 22575-22579), which came into effect in July 2004,


requires e-retailers to post a privacy policy on


the site and to comply with its policy (State of
California, 2007).
However, only a few countries such as Canada
have designed principles especially for consumer
protection in e-retailing. Canada introduced eight
principles of e-consumer protection in August
1999 (Working Group on Electronic Commerce
and Consumers (Canada), 1999) as shown in Appendix 2. Principles 3, 4, and 7 require e-retailers
to respect customers' privacy and ensure that e-transactions are secure. E-retailers should not send commercial e-mails to consumers without consumers' prior consent. This principle aims to
address the first basic consumer right to safety.
In 2004, Canada developed the Canadian Code
of Practice for Consumer Protection in Electronic Commerce, based on the eight principles
of consumer protection. The Code has been
warmly welcomed by other stakeholders (Working
Group on Electronic Commerce and Consumers
(Canada), 2004).

Industry Codes of Practice


Consumers' privacy is also protected by industry
codes of practice. The two best known industry
associations in Australia are the Australian Direct
Marketing Association (ADMA) (a non-profit organisation), and the Australian Internet Industry
Association (IIA) (a national industry organisation) (Australian Direct Marketing Association
(ADMA), 2005; Coonan, 2005; Quo, 2004). Both
devote their activities to advance the interests of
their members and the community as well as to
reinforce consumer confidence (Australian Direct
Marketing Association (ADMA), 2005; Internet
Industry Association (Australian), 2006). These
two organisations have established codes regarding privacy and spam, standards, and mechanisms
to ensure that their members comply with the laws
regarding consumer protection (Clarke, 1998;
Privacy Commissioner (Australia), 2006b).

Firstly, the ADMA has developed a code of


practice which established a self-regulatory
mechanism for its members to comply with privacy
policies and to avoid spam, after consultation with
the relevant authorities, the ACCC and consumer
and business groups (Quo, 2004; Scott, 2004). The
ADMA has also appointed an independent code
authority to monitor the compliance of its members
with the Code of Practice. The code authority can sanction offenders, and the penalty may be the termination of membership.
The second code of practice (2000), introduced by the Australian Internet Industry Association (IIA), discourages its members and code subscribers from using spam as a marketing tool, with an exception in the case of pre-existing relationships (acquaintance spam) (Quo, 2004). Both codes of practice mainly apply to their members and code subscribers, but
non-member organisations are welcome to adopt
these codes.
Unlike many countries, which have not had government-endorsed codes of conduct, some industry associations in Australia have lodged codes of practice with the relevant authorities
for approval (Chan, 2003; OECD, 2003). For instance, the ADMA received conditional authorisation of its direct marketing code of practice from
the ACCC (Australian Competition and Consumer
Commission, 1999), whereas the IIA Content
Regulation Code of Practice (Version 10.4) has
been registered with and administered by the
Australian Communications and Media Authority (Forder, 1999; Internet Industry Association
(Australia), 2006). Yet, insufficient information
about how these codes improve the protection of
consumers' privacy has been reported.

Initiatives from the Private Sector


The private sector has adopted a self-regulatory
approach to address the protection of consumers' privacy by engaging other firms that provide services
related to audits of privacy policies and privacy




seals which could assure that the participating companies adhere to their policies (Egger, 2002).
Some examples of such services are TRUSTe or
WebTrust, BBBOnline Privacy Program, and
BetterWeb (Egger, 2002; Milloy, Fink, & Morris,
2002; Moulinos et al., 2004). The use of security
locks, security and privacy statements, and certificates can increase e-consumer confidence which
can, in turn, increase their purchase intention
(Milloy, Fink, & Morris, 2002).
However, Moulinos et al. (2004) argued that
there have been many factors affecting the efficiency and acceptability of digital seals used
by companies. These factors include the technology related to security, the brand name of the
companies issuing the digital seal, and the legal
framework surrounding online privacy. Yet, these
privacy seals do not offer any legal protection
because they only measure the extent to which
e-retailers conform to their promises. In addition, some seals are recognised in one country
but not in other countries, and others may not have
any value in the same country (Egger, 2002). In
many cases, consumers place their trust more in a
seal of their local Consumers' Association than in
a seal endorsed by other organisations (Egger,
2002). Nevertheless, less than half of respondents
in a survey conducted by Moores (2005) could
recognise a privacy seal, and even those who
could recognise a privacy seal might not know
how to identify whether a seal was genuine or
counterfeit.
New technologies developed by the private sector also offer alternative solutions. The Platform
for Privacy Preferences (P3P) allows users to
compare a Web site's privacy policies with their
own preferences (Yianakos, 2002). This enables
users to select the websites which match their
expectations, and thus they should only do business
with such Web sites. The P3P is supplementary to
legislative and self-regulatory mechanisms to



help in the review and enforcement of Web site


policies (Moghe, 2003). Other means to counter
privacy incidents include the use of digital cash
(cyber cash or Internet cash which does not require
users to reveal their personal and financial information), authentication, filters, and anti-phishing
and anti-pharming systems (Milloy, Fink, &
Morris, 2002; OECD, 2006).
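To make the P3P comparison concrete, the following minimal Python sketch mimics the kind of policy-versus-preference check a P3P user agent automates. It is illustrative only: the field names, purpose labels, and matching rules below are simplified assumptions, not the actual P3P vocabulary or the APPEL rule language.

    # Illustrative sketch of a P3P-style check (simplified assumptions, not the
    # real P3P/APPEL vocabulary): compare a site's declared data-handling
    # practices with a user's stated preferences.

    SITE_POLICY = {
        "data_collected": {"email", "purchase_history"},
        "purposes": {"order_fulfilment", "marketing"},
        "shared_with_third_parties": True,
    }

    USER_PREFERENCES = {
        "disallowed_purposes": {"marketing"},
        "allow_third_party_sharing": False,
    }

    def policy_acceptable(policy: dict, prefs: dict) -> bool:
        """Return True only if the site's declared policy fits the user's preferences."""
        if policy["purposes"] & prefs["disallowed_purposes"]:
            return False  # the site declares a purpose the user has ruled out
        if policy["shared_with_third_parties"] and not prefs["allow_third_party_sharing"]:
            return False  # the site shares data with third parties, which the user forbids
        return True

    if __name__ == "__main__":
        verdict = policy_acceptable(SITE_POLICY, USER_PREFERENCES)
        print("Do business with this Web site?", "yes" if verdict else "no")

In practice, a P3P-enabled user agent reads a site's published machine-readable policy and performs a comparison of this kind before any personal data are submitted, which is what allows consumers to deal only with Web sites that match their preferences.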

Activities by Consumer Associations


The Australian Privacy Foundation (APF), established in 1987, is a non-government voluntary
organisation (Australian Privacy Foundation,
2006). The APF claims that its main objective is to
protect privacy rights of Australians via different
means such as provision of education to increase
public awareness, and advocacy of new legislation
and codes of practice regarding privacy (Australian Privacy Foundation, 2005). One of the main
activities of the APF is to organise campaigns
against privacy threats caused by the adoption
of the ID card scheme (Davies, 2004).
The Australian Consumers' Association (ACA) is a key non-profit consumer association (Brown, 1996). It advocates consumer
rights in both the online and offline markets. It
provides the public with advice on different kinds
of goods and services and represents consumers
(Federal Bureau of Consumer Affairs (Australia),
1995). The ACA has also advocated a review of
privacy legislation and other legislation regarding consumer protection (Australian Consumers
Association, 2004).
Overall, although several measures described
above have been used to address e-consumer protection regarding privacy, insufficient information
about formal evaluation of such measures has
been published. The following section provides
statistics relating to privacy.


Main Thrust: The Current State of E-Consumer Protection Regarding Privacy
Privacy Incidents
International Level
About 62 billion spam e-mails are sent every day
worldwide (Sullivan, 2007). Spam accounted for
93% of all e-mail traffic which was monitored by
Postini, an Internet security firm, in February
2007 (Haymarket Media, 2007).
A study by Teo (2002) in 2002 reported that
43.9% of Singaporean respondents showed concerns about privacy when they shopped online,
whereas Udo's study (2001) reported that 55.1% of U.S.
respondents ranked privacy concerns number one.
According to PC Magazine (2005), 20% of the
respondents in a recent survey were victims of
identity theft, and 64% of U.S. consumers would
not buy online because of concern over personal
information.
Consumers International and the Trans Atlantic Consumer Dialogue (TACD) conducted
an international online survey in 2004 in which
21,000 people from more than 36 countries were
asked about spam. Forty-two percent of them replied that spam e-mails accounted for more than
50% of their e-mails, whereas 84% of respondents
welcomed the prohibition of all unsolicited e-mails
(Consumers International, 2004).

National Level
In Australia, 62% of Australian respondents in
a survey conducted by Roy Morgan Research
in 2004 worried about privacy concerns (Roy
Morgan Research, 2004). Australia loses more
than $1.1 billion per year to identity fraud (The
Age, 2006). The National Australia Bank loses
$1 million per month due to Internet fraud, and

this amount is expected to reach $30 million per


year by 2008 (Lekakis, 2005).
The findings from surveys conducted by Coalition against Unsolicited Bulk E-mail (Australia)
(2002) revealed that the average number of spam
per e-mail address was 118 and 140 in 1999 and
2001 respectively, an increase of 18.6% in Australia. Spam costs Australia about $2 billion a
year (Quo, 2004).
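As a quick check of the growth rate reported above (a simple calculation on the two figures just cited, not an additional survey finding):

\[
\frac{140 - 118}{118} \approx 0.186 \approx 18.6\%
\]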
The highest number of computer-related offences reported to Victoria Police was in 1998-1999, and the number of offences decreased from
2000-2001 to 2002-2003. The most common
offences reported to Victoria Police related to
illegal access to computer systems which can be
classified as security and/or privacy incidents.
Official statistics might not include all the cases
occurring because the police could only deal
with matters that were reported to them (Drugs
and Crime Prevention Committee (Parliament of
Victoria), 2004). In addition, many consumers
did not bother to complain as the cost of redress
might outweigh the claim value (Donahey, 2003;
Patel & Lindley, 2001). Most e-consumers have
been very reluctant to report their problems. This
is evidenced by a low rate of reporting attacks on
computers to police since up to 65% of victims
of e-attacks did not report to law enforcement in
Australia in 2005 (Krone, 2006).
Generally, the statistics about complaints are
fragmented. Some e-consumers have lodged a
complaint with the ACCC or Consumer Affairs
Victoria, whereas others might report their cases
to the police. However, these figures can provide
an overall picture of the current state of e-consumer protection regarding privacy in Australia,
and worldwide, that is, the number of complaints
relating to online shopping has increased.

Consumers' Attitudes Towards Online Privacy Issues
A recent study by Ha (2007, unpublished) provides
an insight into the current state of e-consumer




protection regarding privacy. According to this


study, the majority of the respondents (80%) were
aware of different devices and means used on the
Internet, such as cookies, computer bugs, viruses,
spyware, and adware, which could reveal their
personal data. Nearly three-quarters of them (73%)
were worried about the amount of spam they received. This is consistent with a survey conducted
by the NOIE in 2000 which reported that 77% of
the respondents would not shop online because
of privacy concerns (Consumer Affairs Victoria,
2004). The Australian National Office for the Information Economy (NOIE) was replaced by the
Australian Government Information Management
Office (AGIMO) in 2004 (Williams, 2004).
Only 49% of these respondents agreed on the sufficiency, accuracy, and
ease of locating information about privacy on
commercial Web sites. As discussed previously,
the current Privacy Act (Australia) does not apply to small businesses with annual revenue of
less than A$3 million (Privacy Commissioner
(Australia), n.d.). Thus, small companies are not
required to post any information about privacy.
Most e-retailers are small and new, with limited security skills and budgets (Centeno, 2002), and thus this may explain the low percentage of the respondents who agreed on the adequacy and precision of information about privacy. In this case, the lack of regulation relating to privacy applying to small businesses is one of the weaknesses of the
current policy framework for consumer protection in e-retailing.
Finally, only 54% of the respondents agreed
that they knew how to handle issues relating to
privacy. This means nearly half of them might not
know how to deal with online privacy incidents
(Ha, 2007, unpublished).

How Does a Well-Known E-Retailer Protect the Privacy of Its Customers?
This section illustrates how a well-known e-retailer provides protection to its customers by discussing and evaluating Dell's policies and


practices regarding privacy. Dell has been chosen
because of its unique direct e-business model and
its success as a computer e-retailer operating in
many countries, including Australia (Bangeman,
2006).

Policies and Practices


Dell has posted a privacy policy on its Web site, as required by privacy legislation in Australia.
Dell has also employed several self-regulatory
measures to protect the privacy of its customers.
One of Dell's activities is the launch of an online
essential portal which aims to educate consumers to protect their own privacy. This portal has
been developed by Dell in conjunction with the
National Consumers League (National Consumers League (USA), n.d.). Dell has also worked with
other organisations to launch several campaigns
to enhance public awareness of issues associated
with online shopping. For instance, Dell and the
Internet Education Foundation (www.neted.org)
jointly launched the consumer spyware initiative (CSI) public awareness campaign in 2004
(Claburn, 2004).
Dell's Web site provides information about what types of personal information will be collected, and how customers' particulars will be used and stored (Dell Inc. Australia, 2004). Dell's privacy policy assures customers that their personal data will not be revealed to third parties without their prior written consent (Dell Inc. Australia, 2005). According to Dell's policy, customers
have the right to opt out from the marketing list.
Dell also requests its employees to protect the
confidentiality of information about the relationship between Dell and its customers and other
stakeholders. Consumers can update or correct
their personal information online or by contacting
Dell (Dell Inc. Australia, 2004). Customers can
find a physical address and an online feedback
platform on Dells Web site, and they can make
queries to Dell about privacy issues.


Dell is a member of the U.S.-based Word of Mouth Marketing Association (WOMMA), and was the first corporate subscriber to have publicly committed itself to the code of ethics introduced by WOMMA in 2006 (Word of Mouth Marketing Association, 2006). In addition, Dell is a member of the Australian Internet Industry Association (IIA), the Electronic Industry Code of Conduct (USA), and the BBB OnLine Privacy Program (BBB Online, 2003). Dell's employees must provide accurate and sufficient information to customers, and protect the privacy of both internal and external customers. Dell's employees who do not act in accordance with Dell's policy are liable to discipline and/or civil and/or criminal penalties (Dell Inc. Australia, 2007).

Evaluation
Dell does not fully comply with the regulation, in that it does not provide sufficient information regarding identifiers, anonymity, cross-border data flows, and sensitive information as required by the NPPs (see the second section). This implies that Dell has not fully respected consumers' safety (the first basic consumer right) regarding privacy (see the second section). Also, the contact number and physical address through which consumers can communicate privacy concerns to Dell are in Singapore, and no individual in charge of privacy at Dell is named. This shows that Dell fails to meet the accepted standard of information disclosure (Clayton, 2000; Ha, 2007, unpublished).
Finally, there has been insufficient formal evaluation of how Dell's codes of conduct improve the level of privacy protection. Also, most of Dell's collaboration with industry and consumer associations has taken place in the USA, not in Australia. Furthermore, insufficient information about how Dell has worked with industry and consumer associations in Australia has been reported.
Generally, companies registered in Australia, except for small businesses, have to comply with the current privacy legislation. In addition, guidelines and industry codes of practice are only advisory, not compulsory, whilst the activities of industry and consumer associations are limited. Thus, the protection of consumers' privacy depends much on the willingness and ability of companies to practise corporate social responsibility (CSR) and adopt self-regulatory measures to protect their customers' privacy.
These data demonstrate that the current measures to address privacy issues associated with online shopping may not be effective without the willingness and ability of e-retailers to protect consumers' personal data. The case study also shows that even a well-known e-retailer does not fully comply with the prevailing privacy legislation to protect its customers' privacy, much less voluntarily go beyond the minimum requirements of the laws.

Policy Implications for E-Consumer Protection in Terms of Privacy
There are a number of policy implications for the
protection of e-consumers regarding privacy.
The first implication refers to the coverage of
the current national legislation regarding small
businesses. Most e-retailers are small and they
do not post sufficient information about privacy
on their Web sites. The failure to provide sufficient information regarding privacy is difficult
to reconcile with good standards of consumer
protection. Thus, the current privacy legislation
will be reviewed in order to widen its coverage
and ensure all e-retailers receive equal treatment
in terms of compliance with privacy legislation
(Ruddock, 2006).
The second implication refers to the harmonisation of regulations, standards, and guidelines. Most international guidelines call for the voluntary adoption of good business practice by e-retailers to protect e-consumers' privacy, given the special nature of the online market (Curtis, 2005a; Dudley, 2002; Gilliams, 2003; Lahey, 2005). However, e-consumers perceive that self-regulation means no standards; that is, e-consumers will receive different levels of protection from different e-retailers. Also, different laws and standards across jurisdictions regarding privacy may adversely affect the compliance of e-retailers and the effectiveness of law enforcement. Thus, uniform regulations, guidelines, and CSR can contribute to addressing jurisdiction concerns and enabling consumers to receive an equal level of privacy protection.
The third implication refers to the review of
enforcement mechanisms of both legislation and
guidelines regarding privacy. The AGEC and
industry code of practice are only advisory, not
mandatory. Such guidelines and codes could not
be effective unless backed by legislation and a
compliance regime; there is little incentive for
e-retailers to voluntarily comply with guidelines
and/or codes of conduct (Ha, 2007, unpublished;
Mayer, 2002).
The fourth implication refers to consumer
education. Given the lack of awareness of consumers regarding identity theft and spyware,
educational programs provided by government
agencies, industry and consumer associations
would increase the awareness of e-consumers of
the importance of keeping personal information
confidential. Such education programs could aim
to equip the public with knowledge about how to
recognise online threats and risks as well as how
to avoid online privacy incidents. In addition, some e-retailers are willing to comply with regulations and guidelines, but they do not have sufficient means to do so (Ha, 2007, unpublished). Such e-retailers may not know what information should be posted, or how much information would be sufficient. Thus, education targeted at e-retailers would also be helpful, and industry associations could be the most appropriate candidates to provide it.
Another implication refers to the use of a combination of legal, human, and technical measures to address privacy issues more effectively, as Yianakos (2002) comments, privacy is a problem of both technology and behaviour. Thus, a combination of a legal framework (e.g., legislation, guidelines, and codes of practice), technological measures (e.g., digital seals and certificates), and measures related to human behaviour (e.g., education) is desirable to improve the protection of consumers' privacy in the online market.
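As a concrete illustration of the technological measures mentioned above, the short sketch below shows how a client can retrieve and inspect the TLS certificate that a Web site presents. This is illustrative only and not drawn from this chapter: the host name is a placeholder, and Python's standard ssl and socket libraries are assumed.

```python
# Illustrative sketch: inspecting a site's TLS certificate with Python's standard library.
# The host name below is a placeholder, not a reference to any particular e-retailer.
import socket
import ssl


def fetch_certificate(host: str, port: int = 443) -> dict:
    """Open a verified TLS connection and return the server's certificate fields."""
    context = ssl.create_default_context()  # validates the chain against trusted CAs
    with socket.create_connection((host, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            return tls.getpeercert()


if __name__ == "__main__":
    cert = fetch_certificate("www.example.com")
    print("Issued to:  ", dict(item[0] for item in cert["subject"]))
    print("Issued by:  ", dict(item[0] for item in cert["issuer"]))
    print("Valid until:", cert["notAfter"])
```

A browser performs this validation automatically behind its padlock indicator; the point of the sketch is simply that certificates are a verifiable technical complement to legislation and education, not a substitute for them.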
The final implication refers to the effect of acts enacted by other countries, which have great impacts on global business and privacy protection. For example, the U.S. Fair Information Practice Principles lay the foundation for the formulation of privacy laws by supra-national organisations and many countries (State of California, 2007). The Safe Harbor Privacy Framework, which was approved by the EU in 2000, facilitates the compliance of U.S. companies with the European Directive on Data Protection (1998) and other relevant European privacy laws. Countries with a lower benchmark for the adequacy of privacy protection may have to improve their privacy standards to meet the EU's requirements in order to prevent interruptions to commercial transactions, for example when Australian companies do business in EU countries (Allens Arthur Robinson, 2007; Greenleaf, 2000b, p. 1). However, the Attorney General (Australia) argued that Australian privacy legislation has gone significantly further than the US Safe Harbor Agreement (Allens Arthur Robinson, 2007), although there are still loopholes in the amended Privacy Act (Greenleaf, 2000a). The impacts of privacy acts enacted by supra-national organisations and other countries on international trade should be a subject of further research.

Conclusion

This chapter has examined privacy issues associated with consumer protection in the online market, including data security, spam/spim, and spyware, and the policy framework for addressing online privacy incidents at both the international and national levels.
Although international and national policies to address privacy issues are in place, the effectiveness of the current policy framework has not been formally evaluated. Meanwhile, the number of online privacy incidents has steadily increased, and most consumers do not know how to deal with such incidents. The findings reveal that a single organisation or a single measure is not adequate to address the complicated and challenging issues associated with online privacy. A joint effort of all stakeholders and the adoption of a combination of different measures would be desirable to protect e-consumers' privacy more effectively.
Given the lack of studies on online consumer protection in terms of privacy, further research on online privacy will certainly contribute to the development of a theoretical framework and practical approaches to solving persistent problems with e-consumer protection regarding privacy.

The exposure of loopholes in the current privacy legislation has led the Australian government to review it (Ruddock, 2006). This confirms the potential for more research on the extent to which new legislation could deter undesirable behaviour relating to privacy.
The cross-border and transient nature of e-retailing justifies more research on how legislation and guidelines could be adopted at a supra-national level to more effectively prevent the abuse or illegal use of e-customers' particulars.
In addition, the limited formal evaluation of privacy protection measures in e-retailing suggests they should be further investigated.
Finally, security and privacy issues are interrelated because a lack of security measures may lead to the unwanted disclosure of customers' personal and financial information. Addressing any one of these issues separately is insufficient to ensure that consumers' interests are fully protected. Thus, they must be investigated as an integrated problem and addressed simultaneously.

Future Research Directions

Further research could usefully focus on the
motivations and behaviour of consumers in
protecting themselves against privacy incidents
in the online marketplace and on technical measures which can contribute to addressing online
privacy concerns.
Privacy issues associated with e-retailing arise from a combination of legal, technological, and human behavioural factors, which require different measures by different groups of stakeholders to address them effectively. Thus, future research could also focus on whether cooperation among all stakeholders at all levels (international, national, and sub-national) could address online privacy incidents more effectively, and on whether greater consumer vigilance and self-help efforts could contribute to addressing privacy concerns.

References

Akenji, L. (2004). The eight basic consumer rights. Retrieved November 8, 2006, from http://www.tudatosvasarlo.hu/english/article/print/254
Allens Arthur Robinson. (2007). International
data flow. Retrieved October 2, 2007, from www.
aar.com.au/privacy/over/data.htm
APEC Electronic Commerce Steering Group.
(2004). APEC voluntary online consumer protection guidelines. Retrieved March 1, 2005, from
http://www.export.gov/apececommerce/cp/guidelines.htm
Armstrong, H. L., & Forde, P. G. (2003). Internet
anonymity practices in computer crime. Information Management & Computer Security, 11(5),
209-215.

Australian Communications and Media Authority. (2005). Anti-spam: Fighting spam in Australia (consumer information). Retrieved July 22, 2005, from http://www.acma.gov.au/ACMAINTER.2163012:STANDARD:848603301:pc=PC_1965#anti%20spam%20law
Australian Competition and Consumer Commission. (1999). ACCC conditionally authorises
ADMA code of practice. Retrieved March 27,
2007, from http://www.accc.gov.au/content/index.
phtml/itemId/322914/fromItemId/621589
Australian Competition and Consumer Commission. (2003). Review of building consumer
sovereignty in electronic commerce (best practice
model). Retrieved November 11, 2006, from http://
www.ecommerce.treasury.gov.au/bpmreview/
content/_download/submissions/accc.rtf
Australian Competition and Consumer Commission. (2004). Annual report 2003-2004: Fostering competitive, efficient, fair and informed
Australian markets. Canberra, ACT: Australian
Competition and Consumer Commission.
Australian Consumers Association. (2004).
Submission to the review of the private sector
provisions of the privacy act 1988 (Cth) (the
privacy act) Sydney, NSW: Office of the Privacy
Commissioner (Australia).
Australian Direct Marketing Association
(ADMA). (2005). ADMA profile. Retrieved
August 17, 2005, from http://www.adma.com.
au/asp/index.asp?pgid=2026
Australian Federal Police (AFP). (2007). Internet
fraud. Retrieved March 16, 2007, from http://www.
afp.gov.au/national/e-crime/internet_scams
Australian Institute of Criminology. (2006). More malware: Adware, spyware, spam and spim. High
Tech Crime Brief, 1(2006), 1-2.
Australian Privacy Foundation. (2005). Rule 3: Objectives and purposes. Retrieved April 2, 2007, from http://www.privacy.org.au/About/Objectives.html
Australian Privacy Foundation. (2006a). Identity
checks for pre-paid mobile phones. Retrieved
April 2, 2007, from http://www.acma.gov.au/
webwr/_assets/main/lib100696/apf.pdf
Australian Privacy Foundation. (2006b). International instruments relating to privacy law.
Retrieved February 23, 2007, from http://www.
privacy.org.au/Resources/PLawsIntl.html
Bangeman, E. (2006). Dell growth rate slips
behind market. Retrieved July 20, 2006, from
http://arstechnica.com/news.ars/post/200604206640.html
BBB Online. (2003). Dispute resolution. Retrieved
July 19, 2006, from http://www.bbbonline.org/
privacy/dr.asp
Brown, J. (1996). Australia and the modern consumer movement. A History of the Australian
Consumer Movement (pp. 1-6). Braddon, ACT:
Consumers Federation of Australia.
Buning, M. D. C., Hondius, E., Prins, C., & Vries, M. D. (2001). Consumer@Protection.EU.
An analysis of European consumer legislation in the information society. Journal of Consumer Policy,
24(3/4), 287-338.
Centeno, C. (2002). Building security and consumer trust in internet payments: The potential
of soft measure. Seville, Spain: Institute for
Prospective Technological Studies.
Chan, P. (2003, September). The practical effect
of privacy laws on the global business and global
consumer. Paper presented at the 25th International Conference of Data Protection and Privacy
Commissioners, Sydney, NSW.
Cheng, T. S. L. (2004). Spam regulation: Recent
international attempts to can spam. Computer
Law & Security Report, 20(6), 472-479.


Choice. (2006). The eight basic consumer rights. Retrieved November 5, 2006, from http://www.choice.com.au/viewArticle.aspx?id=100736&catId=100528&tid=100008&p=1&title=The+eight+basic+consumer+rights

Consumer Protection Commission, E. Y. T.


(undated). E-commerce: APEC voluntary online
consumer protection guidelines. Retrieved April
3, 2007, from http://www.cpc.gov.tw/en/index.
asp?pagenumber=25

Claburn, T. (2004). Dell believes education is


best way to fight spyware. InformationWeek,
October 20. Retrieved September 30, from
http://www.informationweek.com/showArticle.
jhtml;jsessionid=GHVMAU4IX1LXGQSNDLO
SKHSCJUNN2JVN?articleID=50900097&quer
yText=Dell+Believes+Education+Is+Best+Way+
To+Fight+Spyware

Consumers International. (2001). Should I buy?


Shopping online 2001: An international comparative study of electronic commerce. London:
Consumers International.

Clark, R. (1989). The Australian privacy act 1988


as an implementation of the OECD data protection guidelines. Retrieved March 27, 2007, from
http://www.anu.edu.au/people/Roger.Clarke/DV/
PActOECD.html
Clarke, R. (1998). Direct marketing and privacy.
Retrieved March 24, 2007, from http://www.anu.
edu.au/people/Roger.Clarke/DV/DirectMkting.
html
Clayton, G. (2000). Privacy evaluation: Dell.
Retrieved July 20, 2006, from http://www.informationweek.com/privacy/dell.htm
Coalition Against Unsolicited Bulk Email (Australia). (2002). Spam volume statistics. Retrieved
June 2, 2007, from http://www.caube.org.au/spamstats.html
Consumer Affairs Victoria. (2003). Commonwealth website guidelines ignored. Retrieved
November 16, 2006, from http://www.consumer.
vic.gov.au/CA256F2B00231FE5/Print/C3DCD
CFFC3DBD8EECA256F54000412C4?OpenD
ocument
Consumer Affairs Victoria. (2004). Online shopping and consumer protection. Discussion paper,
Melbourne, Victoria: Standing Committee of
Officials of Consumer Affairs: E-commerce
Working Party, Consumer Affairs Victoria.

Consumers International. (2004). Annual report


2004. London: Consumers International.
Consumers International. (2006). World consumer
rights day. Retrieved November 7, 2006, from
http://www.consumersinternational.org/Templates/Internal.asp?NodeID=95043&int1stParent
NodeID=89651&int2ndParentNodeID=90145
Coonan, H. (2005, February). 10 years on. 10 years
strong. The internet in Australia. Paper presented
at the 2005 Internet Industry Association Annual
Dinner, Sydney, NSW.
Crime and Misconduct Commission Queensland.
(2004). Cyber traps: An overview of crime,
misconduct and security risks in the cyber environment. Queensland: Crime and Misconduct
Commission.
Curtis, K. (2005a, September). The importance of
self-regulation in the implementation of data protection principles: the Australian private sector
experience. Paper presented at the 27th International Conference of Data Protection and Privacy
Commissioners, Montreux, Switzerland.
Curtis, K. (2005b, March). Privacy in practice.
Paper presented at the Centre for Continuing Legal
Education, University of NSW, Sydney.
Cyota. (2005). Cyota online fraud survey. Retrieved April 7, 2006, from http://www.cyota.
com/press-releases.asp?id=78
Davies, S. (2004, February). The loose cannon: An
overview of campaigns of opposition to national identity card proposals. Paper presented at the


Unisys seminar: e-ID: Securing the mobility
of citizens and commerce in a Greater Europe,
Nice.
Dell Inc. Australia. (2004). Dell's privacy policy.
Retrieved March 2, 2006, from http://www1.
ap.dell.com/content/topics/topic.aspx/ap/policy/
en/privacy?c=au&l=en&s=gen
Dell Inc. Australia. (2005). Dell's online policies.
Retrieved March 28, 2006, from http://www1.
ap.dell.com/content/topics/topic.aspx/ap/policy/
en/au/termsau?c=au&l=en&s=gen
Dell Inc. Australia. (2007). Online communication
policy. Retrieved June 5, 2007, from http://www.
dell.com/content/topics/global.aspx/corp/governance/en/online_comm?c=us&l=en&s=corp
Department of Economic and Social Affairs (UN).
(2003). United nations guidelines for consumer
protection (as expanded in 1999). New York:
United Nations.
Donahey, M. S. (2003). The UDRP model applied
to online consumer transactions. Journal of International Arbitration, 20(5), 475-491.
Drugs and Crime Prevention Committee (Parliament of Victoria). (2004). Inquiry into fraud and
electronic commerce: Final report. Melbourne,
Victoria: Parliament of Victoria.
Egger, F. N. (2002). Consumer trust in e-commerce: From psychology to interaction design. In J.
E. J. Prins, P. M. A. Ribbers, H. C. A. van Tilborg,
A. F. L. Veth, & J. G. L. van der Wees (Eds.), Trust
in electronic commerce: The role of trust from a legal, an organizational and a technical point of view (pp. 11-43). The Hague/London/New York:
Kluwer Law International.
European Commission. (2005). Consumer protection in the European Union: Ten basic principles.
Brussels: European Commission.



Expert Group on Electronic Commerce (Australia). (2003). Review of building consumer sovereignty in electronic commerce: A best practice
model for business. Canberra, ACT: Treasury
(Australia).
Federal Bureau of Consumer Affairs (Australia).
(1993). Your consumer rights. In K. Healey (Ed.),
Consumer rights (Vol. 38). Balmain NSW: The
Spinney Press.
Federal Bureau of Consumer Affairs (Australia).
(1995). Your consumer rights. In K. Healey (Ed.),
Consumer rights (Vol. 38). Balmain NSW: The
Spinney Press.
Forder, J. (1999). The IIA code of practice: Co-regulation of the internet starts here. Retrieved
March 31, 2007, from http://epublications.bond.
edu.au/law pubs/38
Fraud costing Australia $1.1b a Year. (2006, April
7). The Age.
Gilliams, H. (2003, October). Self regulation by
liberal professions and the competition rules.
Paper presented at the Regulation of Professional
Services Conference organized by the European
Commission, Brussels.
Grabner-Kraeuter, S. (2002). The role of consumers trust in online-shopping. Journal of Business
Ethics, 39(1-2), 43-50.
Grabosky, P., Smith, R. G., & Dempsey, G. (2001).
Electronic theft: Unlawful acquisition in cyberspace. Cambridge and New York: Cambridge
University Press.
Graeme, S. (2005). 30 years of protecting consumers and promoting competition. Keeping Good
Companies, 57(1), 38-41.
Greenleaf, G. (2000a). Private sector bill amendments ignore EU problems. Retrieved October
2007, from http://www.austlii.edu.au/au/journals/
PLPR/2000/30.html


Greenleaf, G. (2000b). Safe harbor's low benchmark for adequacy: EU sells out privacy for
U.S.$. Retrieved October 2, 2007, from www.
austlii.edu.au/au/journals/PLPR/2000/32.html
Ha, H. (2005, October). Consumer protection in
business-to-consumer e-commerce in Victoria,
Australia. Paper presented at the CACS 2005
Oceania Conference, Perth, WA.
Ha, H. (2006, September-October). Security issues
and consumer protection in business-to-consumer
e-commerce in Australia. Paper presented at
the 2nd Australasian Business and Behavioural
Sciences Association International Conference:
Industry, Market, and Regions, Adelaide.
Ha, H. (2007). Governance to address consumer
protection in e-retailing. Unpublished doctoral
thesis, Monash University, Melbourne, Victoria.
Harland, D. (1987). The United Nations guidelines
for consumer protection. Journal of Consumer
Policy, 10(2), 245-266.
Harland, D. (1999). The consumer in the globalised
information society: The impact of the international organisations. Australian Competition and
Consumer Law Journal, 7(1999), 23.
Haymarket Media. (2007). Spam hits record levels in February. Retrieved March
20, 2007, from http://www.crn.com.au/story.
aspx?CIID=75798&r=rss
Huffmann, H. (2004). Consumer protection in
e-commerce. University of Cape Town, Cape
Town.
Information Law Branch. (undated). Information
paper on the introduction of the privacy amendment (private sector) bill 2000. Barton, ACT:
Attorney General's Department (Australia).
Internet Industry Association (Australia). (2006a).
Content code. Retrieved March 31, 2007, from
http://www.iia.net.au/index.php?option=com_c

ontent&task=category&sectionid=3&id=19&It
emid=33
Internet Industry Association (Australian).
(2006b). About the IIA. Retrieved March
24, 2007, from http://www.iia.net.au/index.
php?option=com_content&task=section&id=7
&Itemid=38
Jackson, M. (2003). Internet privacy. Telecommunications Journal of Australia, 53(2), 21-31.
James, M. L., & Murray, B. E. (2003). Computer
crime and compromised commerce (Research
Note No. 6). Canberra, ACT: Department of the
Parliamentary Library.
Kaufman, J. H., Edlund, S., Ford, D. A., & Powers,
C. (2005). The social contract core. Electronic
Commerce Research, 5(1), 141-165.
Kehoe, C., Pitkow, J., Sutton, K., Aggarwal, G., &
Rogers, J. D. (1999). Results of GVU's tenth World Wide Web user survey. Retrieved November 16,
2006, from http://www.gvu.gatech.edu/user_surveys/survey-1998-10/tenthreport.html
Kim, S., Williams, R., & Lee, Y. (2003). Attitude
toward online shopping and retail website quality: A comparison of US and Korean consumers.
Journal of International Consumer Marketing,
16(1), 89-111.
Krone, T. (2005). Concepts and terms. Canberra:
The Australian High Tech Crime Centre.
Krone, T. (2006). Gaps in cyberspace can leave
us vulnerable. Platypus Magazine, 90 (March
2006), 31-36.
Lahey, K. (2005, August 30). Red tape on a roll...
and it must stop. The Age, 8.
Lawson, P., & Lawford, J. (2003). Identity theft:
The need for better consumer protection. Ottawa:
The Public Interest Advocacy Centre.
Lekakis, G. (2005). Computer crime: The
Australian facts and figures. Retrieved April 7, 2007, from http://www.crime-research.org/news/19.07.2005/1373/
Lynch, E. (1997). Protecting consumers in the
cybermarket. OECD Observer, 208(Oct/Nov),
11-15.
MacRae, P. (2003). Avoiding eternal spamnation.
Chatswood, NSW: Australian Telecommunications Users Group Limited (ATUG).
Majoras, D. P., Swindle, O., Leary, T. B., Harbour,
P. J., & Leibowitz, J. (2005). The US SAFE WEB
Act: Protecting consumers from spam, spyware,
and fraud. A legislative recommendation to
congress. Washington D.C.: Federal Trade Commission (US).
Mansoorian, A. (2006). Measuring factors for
increasing trust of people in e-transactions. Lulea,
Sweden: Lulea University of Technology.
Mansor, P. (2003, April-May). Consumer interests
in global standards. Paper presented at the Global
Standards Collaboration (GSC) 8User Working
Group Session, Ottawa.
Marshall, A. M., & Tompsett, B. C. (2005). Identity theft in an online world. Computer Law &
Security Report, 21, 128-137.
Mayer, R. N. (2002). Shopping from a list: International studies of consumer online experiences.
Journal of Consumer Affairs, 36(1), 115-126.
Metz, C. (2005, August 23). Identity theft is out
of control. PC Magazine, 87-88.
Milloy, M., Fink, D., & Morris, R. (2002, June).
Modelling online security and privacy to increase
consumer purchasing intent. Paper presented at
the Informing Science + IT Education Conference, Ireland.
Milne, G. R. (2003). How well do consumers protect themselves from identity theft? The Journal
of Consumer Affairs, 37(2), 388-402.



Milne, G. R., Rohm, A. J., & Bahl, S. (2004). Consumers protection of online privacy and identity.
Journal of Consumer Affairs, 38(2), 217-232.
Moghe, V. (2003). Privacy management: A new
era in the Australian business environment. Information Management & Computer Security,
11(2), 60-66.
Moores, T. (2005). Do consumers understand the
role of privacy seals in e-commerce? Communications of the ACM, 48(3), 86-91.
Moulinos, K., Iliadis, J., & Tsoumas, V. (2004).
Towards secure sealing of privacy policies. Information Management & Computer Security,
12(4), 350-361.
Muris, T. J. M. (2002, October). The interface of
competition and consumer protection. Paper presented at the Fordham Corporate Law Institute's
Twenty-Ninth Annual Conference on International Antitrust Law and Policy, New York.
National Consumers League (USA). (n.d.). Essentials for online privacy. Retrieved June 25,
2007, from http://www.nclnet.org/technology/essentials/privacy.html
National Office for the Information Economy
(Australia). (2003). Spam: Final report of the
NOIE review of the spam problem and how it
can be countered. Canberra, ACT: Department
of Communication, Information Technology and
the Arts.
North American Consumer Project on Electronic
Commerce (NACPEC). (2006). Internet consumer
protection policy issues. Geneva: The Internet
Governance Forum (IGF).
NSW Office of Fair Trading. (2003). International
consumer rights: The world view on international
consumer rights. Retrieved November 15, 2006,
from http://www.fairtrading.nsw.gov.au/shopping/shoppingtips/internationalconsumerrights.
html

OECD. (2000). OECD guidelines for consumer protection in the context of electronic commerce.
Paris: OECD.
OECD. (2001a). Australiaannual report on
consumer policy development 2001. Retrieved
March 11, 2005, from http://www.oecd.org/dataoecd/33/45/1955404.pdf
OECD. (2001b). OECD guidelines on the protection of privacy and transborder flows of personal
data. Paris: OECD.
OECD. (2003). Report on compliance with, and
enforcement of, privacy protection online. Paris:
OECD.
OECD. (2006). Protecting consumers from cyberfraud. Paris: OECD.
Patel, A., & Lindley, A. (2001). Resolving online
disputes: Not worth the bother? Consumer Policy
Review, 11(1 (Jan/Feb)), 2-5.
PC Magazine. (2005). The perils of online shopping. PC Magazine, 24(14), 23.
Petty, R. D., & Hamilton, J. (2004). Seeking a
single policy for contractual fairness to consumers: A comparison of U.S. and E.U efforts. The
Journal of Consumer Affairs, 38(1), 146-166.
Privacy Commissioner (Australia), O. o. (2000).
National privacy principles (extracted from the
privacy amendment (private sector) act 2000).
Retrieved July 19, 2006, from http://www.privacy.
gov.au/publications/npps01.html
Privacy Commissioner (Australia), O. o. (2003).
National privacy principle 7: Identifiers in the
health sector. Sydney: Privacy Commissioner,
Office of (Australia).
Privacy Commissioner (Australia), O. o. (2006a).
Annual report 2005-2006. Melbourne, Victoria:
Office of the Victorian Privacy Commissioner.
Privacy Commissioner (Australia), O. o. (2006b).
Industry standard for the making of telemarketing calls. Sydney, NSW: Office of the Privacy Commissioner (Australia).
Privacy Commissioner (Australia), O. o. (n.d.).
State & territory privacy laws. Retrieved August
2, 2005, from http://www.privacy.gov.au/privacy_rights/laws/index.html#2
Quirk, P., & Forder, J. (2003). Electronic commerce and the law (2nd ed.). Queensland: John
Wiley & Sons Australia, Ltd.
Quo, S. (2004). Spam: Private and legislative
responses to unsolicited electronic mail in Australia and the United States. E Law: Murdoch University, 11(1).
Round, D. K., & Tustin, J. (2004, September).
Consumers as international traders: Some potential information issues for consumer protection
regulators. Paper presented at the International
Trade Law Conference, Attorney-General's Department, Canberra, ACT.
Roy Morgan Research. (2004). Community attitudes towards privacy 2004. Sydney, NSW:
The Office of the Federal Privacy Commissioner
(Australia).
Ruddock, P. (2006). Australian law reform commission to review privacy act. Retrieved June 7,
2007, from http://www.ag.gov.au/agd/WWW/
MinisterRuddockHome.nsf/Page/Media_Releases_2006_First_Quarter_31_January_2006__Australian_Law_Reform_Commission_to_review_Privacy_Act_-_0062006#
Saarenpää, T., & Tiainen, T. (2003). Consumers and e-commerce in information system studies. In M. Hannula, A.-M. Järvelin, & M. Seppä (Eds.),
Frontiers of e-business research: 2003 (pp. 62-76).
Tampere: Tampere University of Technology and
University of Tampere.
Scott, C. (2004). Regulatory innovation and the
online consumer. Law & Policy, 26(3-4), 477-506.



Scottish Consumer Council. (2001). E-commerce and consumer protection: Consumers' real
needs in a virtual world. Glasgow: Scottish
Consumer Council.
Singh, B. (2002). Consumer education on consumer rights and responsibilities, code of conduct
for ethical business, importance of product labelling. Kuala Lumpur: Consumers International.
Smith, L. (2004). Global online shopping: How
well protected is the Australian consumer? Australian Competition and Consumer Law Journal,
12(2), 163-190.
State of California. (2007). Privacy laws. Retrieved September 26, 2007, from http://www.
privacy.ca.gov/lawenforcement/laws.htm
Stoney, M. A. S., & Stoney, S. (2003). The problems
of jurisdiction to e-commerce: Some suggested
strategies. Logistics Information Management,
16(1), 74-80.
Sullivan, B. (2007). Spam is back, and worse
than ever. Retrieved March 20, 2007, from
http://redtape.msnbc.com/2007/01/spam_is_
back_an.html
Teo, T. S. H. (2002). Attitudes toward online shopping and the internet. Behaviour & Information
Technology, 21(4), 259-271.
Treasury (Australia). (2006). The Australian
guidelines for electronic commerce (March 2006).
Canberra, ACT: Treasury (Australia)
Udo, G. J. (2001). Privacy and security concerns
as major barriers for e-commerce: A survey study.
Information Management & Computer Security,
9(4), 165-174.
United Nations. (1948). Universal declaration of
human rights. New York: United Nations.
U.S. Senate Permanent Subcommittee on Investigations. (1996). Security in cyberspace. Retrieved
April 3, 2007, from http://www.fas.org/irp/congress/1996_hr/s960605t.htm



Vaile, D. (2004). Spam canned: New laws for


Australia. Internet Law Bulletin, 6(9), 113-115.
Vasiu, L., Warren, M., & Mackay, D. (2002, December). Personal information privacy issues in
B2C e-commerce: a theoretical framework. Paper
presented at the 7th Annual CollECTeR Conference on Electronic Commerce (CollECTeR02),
Melbourne, Victoria
Walczuch, R., Seelen, J., & Lundgren, H. (2001,
September). Psychological determinants for
consumer trust in e-retailing. Paper presented
at the Eight Research Symposium on Emerging
Electronic Markets (RSEEM01), Maastricht, The
Netherlands.
Williams, D. (2004, 27 Feb). Business guides to
combat spam. Retrieved August 2, 2005, from
http://www.agimo.gov.au/media/2004/02/12070.
html
Williams, D. (2004, 10 March). Maximising the
benefits of the information economy. Retrieved
August 2, 2005, from http://www.agimo.gov.
au/media/2004/03/21377.html
Word of Mouth Marketing Association. (2006,
13 Nov). Dell makes public commitment to word
of mouth ethics. Retrieved January 5, 2007, from
http://www.womma.org/womnibus/007895.php
Working Group on Electronic Commerce and
Consumers (Canada). (1999). Principles of consumer protection for electronic commerce: A
Canadian framework. Ottawa: Canada Bankers
Association.
Working Group on Electronic Commerce and
Consumers (Canada). (2004). Canadian code of
practice for consumer protection in electronic
commerce. Ottawa: Office of Consumer Affairs,
Industry Canada.
Yianakos, C. (2002). Nameless in cyberspace: Protecting online privacy. B+FS, 116(6), 48-49.


Zhang, X. (2005). What do consumers really know? Communications of the ACM, 48(8), 44-48.

Acts
Anti-Phishing Act 2005 (California)
CAN-SPAM Act 2003 (USA)
Computer Spyware Act 2004 (California)
Online Privacy Protection Act 2003 (California)
Privacy Act 1988 (Public sector) (Cth)
Privacy Amendment (Private Sector) Act 2000
(Cth) sch 3
Spam Act 2003 (Cth)
Telecommunications (Interception and Access)
Act 1979 (Cth)

Additional Reading
Iannuzzi, A., Jr. (2002). Industry self-regulation
and voluntary environmental compliance. Boca
Raton: Lewis Publishers.
Ang, P. H. (2001). The role of self-regulation of
privacy and the internet. Journal of Interactive
Advertising, 1(2), 1-11.
Australian Privacy Foundation. (2006). Identity
checks for pre-paid mobile phones. Retrieved
April 2, 2007, from http://www.acma.gov.au/
webwr/_assets/main/lib100696/apf.pdf
Business for Social Responsibility. (2005, April
2005). Privacy (consumer and employee).
Retrieved December 8, 2006, from http://
www.bsr.org/CSRResources/IssueBriefDetail.
cfm?DocumentID=50970
Chung, W. C., & Paynter, J. (2002, January).
Privacy issues on the internet. Paper presented
at The 35th Hawaii International Conference on
System Sciences, Hawaii.

Clayton, G. (2000). Privacy evaluation: Dell.


Retrieved July 20, 2006, from http://www.informationweek.com/privacy/dell.htm
Coonan, H. (2005, November). Self-regulation and
the challenge of new technologies. Paper presented
at the AMTA Annual general Meeting Luncheon,
Sheraton on the Park, Sydney, NSW.
Drake, W. J. (2004). ICT global governance and the
public interest: Transactions and content issues.
Geneva, Switzerland: Computer Professionals for
Social Responsibility.
Drezner, D. W. (2004). The global governance of
the internet: Bringing the state back in. Political
Science Quarterly, 119(3), 477-498.
Gunningham, N., & Rees, J. (1997). Industry
self-regulation: An institutional perspective. Law
& Policy, 19(4), 363-414.
Ha, H. (2006). Regulation. In Encyclopedia of World Poverty (Vol. 2, pp. 903-906). Thousand Oaks,
London, New Delhi: Sage Publications.
Lacey, D., & Cuganesan, S. (2004). The role of
organizations in identity theft response: The organization-individual victim dynamic. Journal
of Consumer Affairs, 38(2), 244-261.
Linnhoff, S., & Langenderfer, J. (2004). Identity
theft legislation: The fair and accurate credit
transactions act of 2003 and the road not taken.
Journal of Consumer Affairs, 38(2), 204-216.
Marshall, A. M., & Tompsett, B. C. (2005). Identity theft in an online world. Computer Law &
Security Report, 21, 128-137.
OECD. (2001). OECD guidelines on the protection of privacy and transborder flows of personal
data. Paris: OECD.
OECD. (2006). Report on the implementation of
the 2003 OECD guidelines for protecting consumers from fraudulent and deceptive commercial
practices across borders. Paris: OECD.



Office of Communications (UK). (2006). Online protection: A survey of consumer, industry and
regulatory mechanisms and systems. London:
Office of Communications (UK).
O'Neill, B. (2001). Online shopping: Consumer
protection and regulation. Consumer Interests
Annual, 47, 1-5.
Price, M. E., & Verhulst, S. G. (2004). Self regulation and the internet. Alphen aan den Rijn, The
Netherlands: Kluwer Law International.
Schwartz, P. M. (1999). Privacy and democracy in cyberspace. Vanderbilt Law Review, 52, 1609-1702.
Simpson, S., & Wilkinson, R. (2003, September
2003). Governing e-commerce: Prospects and
problems. Paper presented at the 31st Telecommunications Policy Research Conference, Communication, Information and Internet Policy,

National Centre for Technology and Law, George Mason University, School of Law, Arlington, VA.
Stafford, M. R. (2004). Identity theft: Laws,
crimes, and victims. Journal of Consumer Affairs,
38(2), 201-203.
Sylvan, L. (2002, September). Self-regulation: Who's in charge here? Paper presented at the
Australian Institute of Criminology Conference
on Current Issues in Regulation: Enforcement
and Compliance, Melbourne, Victoria.
Sylvan, L. (2004, September). Issues for consumers in global trading. Paper presented at the 26th
International Trade Law Conference, Rydges
Lakeside, Canberra, ACT.


Appendix A

Principles and Guidelines of Consumer Protection by the United Nations (UN), Organisation for Economic Co-operation and Development (OECD), European Union (EU), and Asia Pacific Economic Cooperation (APEC) (based on information in Department of Economic and Social Affairs (UN), 2003; OECD, 2000; European Commission, 2005; Consumer Protection Commission, E. Y. T., n.d.)
Table 1A.
No. | UN (a) | OECD (b) | EU (c) | APEC (d)
1 | Physical safety | Transparent and effective protection | Buy what you want, where you want | International cooperation
2 | Promotion and protection of consumers' economic interests | Fair business, advertising, and marketing practices | If it does not work, send it back | Education and awareness
3 | Standards for the safety and quality of consumer goods and services | Online disclosures: information about the business, the goods or services, the transaction | High safety standards for food and other consumer goods | Private sector leadership
4 | Distribution facilities for essential consumer goods and services | Confirmation process | Know what you are eating | Online advertising and marketing
5 | Measures enabling consumers to obtain redress | Payment | Contracts should be fair to consumers | Online information disclosure to consumers
6 | Education and information programs | Dispute resolution and redress | Sometimes consumers can change their mind | Confirmation process
7 | Promotion of sustainable consumption | Privacy | Making it easier to compare prices | Resolution of consumer disputes
8 | Measures relating to specific areas | Education and awareness | Consumers should not be misled | Privacy
9 | - | - | Protection while you are on holiday | Security
10 | - | - | Effective redress for cross-border disputes | Choice of law and jurisdiction

Sources: (a) Department of Economic and Social Affairs (UN). (2003). United Nations Guidelines for Consumer Protection
(as expanded in 1999). New York: United Nations.
(b) OECD. (2000). Guidelines for Consumer Protection in the Context of Electronic Commerce. Paris: OECD.
(c) European Commission. (2005). Consumer Protection in the European Union: Ten Basic Principles. Brussels: European
Commission.
(d) Consumer Protection Commission, E. Y. T. (undated). E-Commerce: APEC Voluntary Online Consumer Protection Guidelines.
Consumer Protection Commission, Executive Yuan (Taiwan). Retrieved April 3, 2007, from http://www.cpc.gov.tw/en/index.
asp?Pagenumber=25



Appendix B
Summary of Eight Principles of Consumer Protection in Canada (based on information in Working
Group on Electronic Commerce and Consumers (Canada), 1999)
Table 2B.
No. | Principles
1 | Information provision
2 | Contract formation
3 | Privacy
4 | Security of payment and personal information
5 | Redress
6 | Liability
7 | Unsolicited commercial e-mail
8 | Consumer awareness

Sources: Working Group on Electronic Commerce and Consumers (Canada). (1999). Principles of Consumer Protection for
Electronic Commerce: A Canadian Framework. Ottawa: Canada Bankers Association.




Chapter VII

Antecedents of Online Privacy Protection Behavior:
Towards an Integrative Model

Anil Gurung
Neumann College, USA

Anurag Jain
Salem State College, USA

Abstract
Individuals are generally concerned about their privacy and may refrain from disclosing their personal information while interacting with online vendors. Withholding personal information can prevent online vendors from developing profiles to match needs and wants. Through a literature review of research on online privacy, we develop an integrative framework of online privacy protection.

Introduction
The latest report on e-commerce by the U.S. Census Bureau (2007) shows that although there has been an increase in online purchasing by individuals, the share of consumer e-commerce (online sales) in total retail sales is far smaller than the share of electronic business-to-business sales in total business-to-business sales. One of the factors that may be influencing this online consumer behavior is the privacy concerns that consumers have regarding the personal data collection procedures used by online companies. An individual's trust in online companies and



their data collection procedures has been the major factor hindering the growth of electronic commerce (Belanger, Hiller, & Smith, 2002; Liu, Marchewka, Lu, & Yu, 2004).
Companies use consumer data to study consumer preferences so that they can build effective strategies to expand their customer base. Emergent technologies and organizational practices in gathering data raise privacy concerns. Such technologies include the use of cookies, authentication programs, spyware, and adware. The growth of technologies to collect information about consumers may only fuel consumers' privacy concerns. Companies have realized that protecting consumers' private information is an essential component in winning the trust of consumers and is a must in facilitating business transactions (Belanger et al., 2002; McKnight & Chervany, 2001). Privacy policies that inform the consumer about how the collected information will be used are usually posted on Web sites. However, there is not enough evidence to prove whether or not these policies are effective in alleviating consumers' privacy concerns. In the absence of any strong mechanisms, technologies, or policies that ensure information privacy, consumers adopt different strategies to protect their privacy. Such strategies may include, for instance, abstaining from purchasing, falsifying information, and adjusting security and privacy settings in their Web browsers (Chen & Rea, 2004).
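To make the cookie-based data collection mentioned above concrete, the minimal sketch below shows how a site can assign a persistent identifier to a browser and recognise it on later visits. It is illustrative only and not drawn from this chapter: the Flask framework, the route, and the cookie name are assumptions chosen for brevity.

```python
# Illustrative sketch: a persistent identifier cookie, the basic building block of
# the profiling practices discussed above. Framework (Flask) and names are hypothetical.
import uuid

from flask import Flask, make_response, request

app = Flask(__name__)


@app.route("/")
def index():
    visitor_id = request.cookies.get("visitor_id")
    response = make_response("Welcome back!" if visitor_id else "Hello, new visitor.")
    if visitor_id is None:
        # A long-lived unique identifier lets the site link this browser to
        # future page views and purchases, enabling the profiles described above.
        response.set_cookie("visitor_id", uuid.uuid4().hex, max_age=60 * 60 * 24 * 365)
    return response
```

Clearing or blocking such cookies through the browser's privacy settings is one of the consumer self-protection strategies noted above.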
In this chapter, we review the existing literature
and analyze the existing online privacy theories,
frameworks, and models. Through the analysis
of the literature, we aim to understand existing
privacy frameworks and variables that are used in
the context of online privacy protection. Finally,
based on the review, we develop an integrative
framework to encapsulate the antecedents to
online privacy protection behavior.
The motivation for this study is to understand
the factors that are related to online privacy
protection. Although this topic has been studied



in other disciplines, such as marketing, (e.g.,


Sheehan, & Hoy, 1999), the literature review
shows that research on privacy is fragmented.
The proposed integrative framework aims to
integrate these fragmented yet related constructs
under one overarching concept. This will help us
in expanding our understanding of the various
issues involved in online privacy. Specifically,
we focus on what has been done in privacy protection and how future studies in this area can
proceed forward.

Background
Research has shown that privacy concerns act as
a hindrance to the growth of electronic commerce
(Hoffman, Novak, & Peralta, 1999; Miyazaki &
Fernandez, 2001). In countering privacy concerns,
the Federal Trade Commission has primarily relied
upon fair information practices to guide privacy
regulation in the United States (Milne, 2000).
Fair information practices include the following:
notice of the firm's information practices regarding what personal information will be collected and how the collected information will be used; choice or consent regarding the secondary use of the information; access for users to view
their own data collected by companies; security
of the collected data; and enforcement to ensure
that companies comply with fair information
practices.
Research shows that fair information practices have not been effective in alleviating the
privacy concerns of consumers (Culnan, 2000).
In the absence of stricter laws to ensure privacy,
consumers adopt differing strategies to protect
their identity online, for instance, falsification,
passive reaction, and identity modification (e.g.,
Sheehan & Hoy, 1999). For the purpose of this
chapter, the strategies adopted by consumers to
protect their identity are defined under the general term of "privacy protection behavior" in an online environment.

Main Research Questions

How do information
privacy concerns affect the
growth and development
of consumer-oriented
commercial activity on the
internet?

What actions are taken by


online consumers in response
to their privacy concerns?

What is the extent of online


retailer disclosures of various
privacy and security related
practices?

What is the extent to


which consumer-oriented
commercial Web sites post
disclosures that describe their
information practices and
whether these disclosures
reflect fair information
practices?

Develop a privacy research


framework that highlights
key dimensions of the
information interaction
between marketers and
consumers.

What are the underlying


factors of online privacy
concerns?

Authors &
Year

(Hoffman et
al., 1999)

(Sheehan &
Hoy, 1999)

(Miyazaki &
Fernandez,
2000)

(Culnan,
2000)

(Milne, 2000)

(Sheehan &
Hoy, 2000)
Survey

Conceptual

Survey

Survey

889

361

128

Awareness
of collection,
information
usage, sensitivity
of information,
familiarity, and
compensation

Marketer information
strategy

Personal identifying
information

Privacy related
statements,
security-related
statements, consumer
perceptions

Privacy concerns,
situational contexts

889

Survey

IV

Information privacy,
environmental
control, and
secondary use of
information control

Conceptual

Method

Consumer
information
behavior

Information practices

Online disclosures and


information practices

Fair information practices

Online disclosures

Purchase
likelihood

Information
practice
and privacy
policy
disclosures

There is some correlation between privacy


concerns and consumer complaining
behavior such as flaming, complaining,
or abstaining from participating in online
activities. The most frequently adopted
complaining behavior was providing
incomplete information when registering
for Web sites.

Consumer complaining
behavior

Consumer
behavior
(i.e.,
falsifying
information,
reading
unsolicited
e-mail

Three important factors are: control over


collection and usage of information;
short-term transactional relationship; and
established long term relationship.

The main types of interactions are


information requests/disclosures,
information provision, information
capturing without consent, and information
practices.

The most of the Web sites surveyed notified


about their information practices but did
not fully disclose fair information practices.
Not having a fully-agreed definition for fair
information practices pose challenges in
assessing online disclosures.

A positive relationship exists between the


percentage of privacy and security related
statements on Web sites for particular
online shopping categories and consumers
online purchase likelihoods.

The opt-in, informed consent policies


are beneficial for online businesses. The
most effective way for commercial Web
providers to develop profitable exchange
relationships is gaining consumer trust.

Relationship exchange

Policy,
Protection

Findings

Theoretical Framework

DV


Table 1. A review of online information privacy literature






What are the important


features of business to
consumer Web sites?

What are the differences in


privacy concerns of online
users?

What is the impact of


customer perceptions of
security control on ecommerce acceptance?

(Ranganathan
& Ganapathy,
2002)

(Sheehan,
2002)

(Suh & Han,


2003)
Survey

Survey

Survey

502

889

214

100

Authentication,
nonrepudiation,
confidentiality,
privacy protection,
data integrity, trust,
attitude, behavioral
intention

Awareness,
usage, sensitivity,
familiarity, and
compensation

Information content,
design, security, and
privacy

Choice, access,
security, and notice

Study the content of online


privacy notices to inform
public policy

(Milne &
Culnan, 2002)

Survey

Trustworthiness, site
quality, privacy, and
security features

Experiment

What is the importance


of third party privacy
seals, privacy statements,
third party security seals,
and security features on
purchasing behavior of
consumers? What role does
trustworthiness play in
consumer behavior?

(Belanger et
al., 2002)

140

Disposition to trust,
institution-based
trust, trusting beliefs,
trusting intentions,
and Web vendor
interventions (i.e.,
third party seals,
privacy policy)

Conceptual

What are different typologies


of trust and how do they
relate with e-commerce
consumer behavior?

Internet experience,
purchasing method,
risk concerns

(McKnight
& Chervany,
2001)

160

Survey

How do risk perceptions vary


with Internet experience?
What is the effect of risk
perceptions on online
shopping activity?

(Miyazaki &
Fernandez,
2001)

Actual use

Purchase
intent

Information
disclosure

Technology acceptance
model

Information practices

Information practices

Fair information practices

Information practices

Customer perceived strength of


nonrepudiation, privacy protection, and data
integrity was important for determining
e-commerce acceptance.

The privacy concerns of consumers


vary depending upon the situation. The
contextual nature of online privacy makes
it difficult to predict how online users will
react to specific online situations.

Security is the best predictor of online


purchase intent followed by privacy, design
and information content.

Effective privacy notice is the first step


towards privacy protection. The amount of
Web sites that posted privacy notice grew
from 1998 to 2001.

Security features are more important than


privacy and security seals. Trustworthiness
of Web merchants is important.

A trust model is presented which helps to


study consumer trust at levels of personal,
institutional, and interpersonal.

Theory of reasoned action

Trust related
Internet
behavior

Intention to
purchase

This study indicates that higher levels of


Internet experience may lead to lower risk
perceptions regarding online shopping and
fewer specific concerns regarding system
security and online retailer fraud, yet more
privacy concerns.

Information practices

Online
purchasing
rate


Table 1.continued


23

Interview

What is the perception of


Internet users regarding
privacy? What are the
implications of gathering
information by offering
financial benefits?

What is the relationship


between privacy risk
beliefs and confidence and
enticement beliefs that
influence the intention to
disclose information?

Do consumers value privacy


statements and privacy seals?
If so, do these statements
and seals affect consumer
disclosure of personal
information?

Do information transparency
features, which provide
knowledge of information
and procedures, affect
willingness for information
disclosure?

(Olivero &
Lunt, 2004)

(Dinev &
Hart, 2006)

(Hui, Teo, &


Lee, 2007)

(Awad &
Krishnan,
2006)
Survey

Experiment

Survey

293
&
449

Survey and
experiment

What is the nature and


dimensions of Internet
users information privacy
concerns?

(Malhotra,
Kim, &
Agarwal,
2004)

401

109

369

212

Experiment

Do privacy seals ease


privacy concerns of online
customers?

(Liu et al.,
2004)

102

Survey

What types of privacy


control techniques are used
in an online context?

(Chen & Rea,


2004)

Information
transparency, privacy
concern, privacy
policy, and previous
privacy invasion

Privacy statement,
privacy seal,
monetary incentive,
sensitivity of
information

Privacy concerns,
trust, privacy risk,
personal interest

Attitude toward
privacy, control,
perceived risk,
and awareness
of information
collection

Collection, control,
awareness, type of
information, trust
beliefs, risk beliefs

Notice, access,
choice, security, and
trust

Concerns of
unauthorized
use Concerns of
giving out personal
information

Willingness
to be
profiled

Information
disclosure

Willingness
to disclose
information

Information practices

Contemporary choice
theory

Privacy calculus

Online disclosures

Social contract theory

Behavioral
intention

Willingness
to disclose
information

Theory of reasoned action

Information practices

Behavioral
intention to
purchase

Privacy
controls

Customers who desire greater information


transparency are less willing to be profiled.

The existence of privacy statement was


effective for information disclosure while
that of privacy seal was not. Monetary
incentive was positive influence on
disclosure. Information request had a
negative influence on disclosure.

Privacy concerns inhibit e-commerce


transactions. Trust and personal interest
outweigh privacy risk perceptions
while deciding on personal information
disclosure.

Perceived risk and awareness of


information collection are related with a
shift in concerns from trust issues to control
issues. Risk awareness reduced the level of
trust and increased the demand for control.

The second order Internet users


information privacy concerns scale is
developed with dimensions of collection,
control, and awareness. Privacy concerns
will have negative influence the willingness
to have relationships with online
companies.

Privacy concerns have strong influence


on whether an individual will trust an
electronic commerce business. Trust
will influence the behavioral intention to
purchase online.

Passive control was related to the concern


of unauthorized use of personal information
and identity modification was related to the
concern of giving out personal information.


Table 1.continued




Review of Findings
The methodology followed for this chapter was
a literature review. In this conceptual study,
the existing privacy and related literature was
analyzed to identify existing frameworks and
variables related to online privacy. In the review
of the literature, we retained the studies where
privacy was studied in an online context and the unit of analysis was the individual and/or online consumer. The results of the literature review are presented in Table 1. The research articles that were considered for the review were published from 1999 onwards. This was necessary, since the popular media has been rife with news coverage on
heightened privacy concerns of consumers since
that time. Most of the research studies included
in the review used a survey methodology, while
experiments were the second most frequently
used methodology.
Our review of the literature on privacy revealed
that most of the research studied the consumers
willingness to disclose information in light of
their privacy concerns (Dinev & Hart, 2006;
Hui et al., 2007; Malhotra et al., 2004; Milne &
Culnan, 2002; Olivero & Lunt, 2004). There were
other group of literature that studied the consumers willingness to purchase in light of privacy
concerns (Belanger et al., 2002; Miyazaki &
Fernandez, 2001; Suh & Han, 2003). There were
very few studies that actually studied privacy
protection behavior (Chen & Rea, 2004). The
review of current research shows that privacy
concerns affect the disclosure of information
or purchase intent of consumers (Belanger et
al., 2002; Malhotra et al., 2004). The reviewed
literature gives us insights into how the privacy
construct is used with other related constructs
from different perspective. Therefore, we feel
it is necessary that an integrative framework of
privacy be proposed. This framework would be
helpful to study in more completeness, the impact
of privacy on consumer behavior. As outlined
in the beginning, the proposed framework will



attempt to explain the antecedents that lead to


privacy protection behavior. Since only one
study specifically examined the privacy protection behavior, we feel that a discussion on how
privacy concerns will affect consumer behavior is
relevant before outlining a framework on privacy
protection behavior. Therefore, we first proceed
to discuss the existing typologies of privacy concerns. These typologies explain both the states
and types of privacy concerns in an individual.
A mixed typology is put forth that combines both
the states and types of privacy concerns.

Typology of Privacy Concerns

Several privacy typologies have been suggested, such as privacy aware, privacy active, and privacy suspicious (Drennan, Mort, & Previte, 2006). Privacy aware refers to being knowledgeable and sensitive about the risks associated with sharing personal information online. The privacy aware factor consists of selectivity about information provision, awareness of the sensitivity of a mother's maiden name, and perceptions that online companies require an excessive amount of personal information. The privacy active factor refers to active behaviors adopted by consumers in response to their privacy concerns. This factor consists of seeking detailed information about online purchasing, requesting that companies not share collected personal information, and regularly changing passwords to protect one's privacy. The privacy suspicious factor refers to concerns about company behavior regarding privacy practices. This factor consists of awareness of companies' plans to share collected personal information, the belief that company privacy policies are hard to find on their Web sites, and checking to make sure that e-mail addresses and phone numbers are provided online before transactions. In summation, these typologies relate to the state or degree of privacy concern that exists in individuals. In addition to the privacy typologies described, other typologies have also been suggested in the literature.

Table 2. Mixed typology of privacy concerns (rows: degree of concern; columns: privacy aware, privacy active, privacy suspicious)

Fundamentalists: Fundamentalists & privacy aware | Fundamentalists & privacy active | Fundamentalists & privacy suspicious
Unconcerned: Unconcerned & privacy aware | Unconcerned only | Unconcerned only
Pragmatists: Pragmatists & privacy aware | Pragmatists & privacy active | Pragmatists & privacy suspicious

For instance, the report by the Federal Trade Commission (1996) categorizes consumers into groups such as fundamentalists, unconcerned, and pragmatists, as suggested by Westin (1967). Fundamentalist individuals prefer privacy controls over consumer benefits and comprise one fourth of the population. They are unlikely to partake in any activities that will compromise their privacy. The unconcerned individuals fall at the other extreme and also comprise one fourth of the population. They are willing to forego their privacy if they can enjoy consumer benefits; such individuals are the most likely to join reward programs and are more willing to divulge their personal information in order to get discounts. The other half of the population is comprised of pragmatists, who weigh the advantages of various consumer benefits against the degree of personal information sought by companies. Building upon Westin's typology, Sheehan (2002) suggested unconcerned, circumspect, wary, and alarmed as privacy typologies. The unconcerned users have minimal privacy concern; they are willing to provide accurate information to online companies. The circumspect also have minimal privacy concerns; however, they are more likely than the unconcerned to provide incomplete information to online companies during registration. The wary have moderate privacy concerns and are likely to provide incomplete information during registration. The alarmed users are highly concerned; even if they register for Web sites, they are more likely to provide incomplete or inaccurate information.

In our analysis, Westin's and Sheehan's privacy typologies relate to the state or degree of privacy concern. Some consumers are very concerned while some are not concerned at all, with the rest of the population falling somewhere between the extremes.


On the other hand, the typology suggested by Drennan et al. (2006) is behavior-centric, as its categories refer to behavior in response to privacy concerns. Rather than being mutually exclusive, these two suggested typologies are related. This relatedness is illustrated in the three-by-three matrix in Table 2. In the second row of the table, the second and third cells contain only "unconcerned." We believe that if consumers are unconcerned about their privacy, they are less likely to be privacy active or privacy suspicious, although they may be aware of privacy issues. For our degree-of-privacy-concern typology, we have followed Westin's (1967) typology instead of Sheehan's for its conciseness. To form the mixed typology, we combined Westin's typology with the typology suggested by Drennan et al. (2006).
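As a small illustrative aid (not part of the original chapter), the sketch below encodes the combination rule behind Table 2; the label strings are chosen here for convenience and are not taken from the source instruments.

# Hypothetical sketch of the Table 2 mixed typology: combine Westin's
# degree-of-concern categories with Drennan et al.'s behavioral categories.
# Consumers who are unconcerned and not merely privacy aware collapse into
# the "Unconcerned only" cell, as described in the text.
DEGREES = {"fundamentalist", "pragmatist", "unconcerned"}
BEHAVIORS = {"privacy aware", "privacy active", "privacy suspicious"}

def mixed_typology(degree, behavior):
    """Return the Table 2 cell for a consumer's degree of concern and behavior."""
    if degree not in DEGREES or behavior not in BEHAVIORS:
        raise ValueError("unknown degree or behavior label")
    if degree == "unconcerned":
        return "Unconcerned & privacy aware" if behavior == "privacy aware" else "Unconcerned only"
    prefix = "Fundamentalists" if degree == "fundamentalist" else "Pragmatists"
    return f"{prefix} & {behavior}"

print(mixed_typology("pragmatist", "privacy active"))      # Pragmatists & privacy active
print(mixed_typology("unconcerned", "privacy suspicious"))  # Unconcerned only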

Privacy Protection

Approaches taken by individuals to protect their personal information online may be passive or active. Passive protection may involve depending upon external entities, such as government or private institutions, rather than adopting any privacy protection oneself. As the name suggests, active protection involves using different measures for privacy protection. Some of these privacy protection strategies are as follows: use personal firewalls, withhold information from a Web site, remove one's name and address from mailing lists, inform Web sites not to share information, avoid using a Web site, disable cookies, use anti-spyware tools, and provide false or incomplete information when registering on a Web site. Privacy protection can be viewed from three perspectives: prerogative, objective, and subjective (Yao, 2005). Prerogative privacy is enforced at such a broad level by the government that it is hard to link it to the beliefs, attitudes, and behaviors of individuals. Objective privacy focuses on the effectiveness of privacy protection strategies, such as the ones set by the Federal Trade Commission. Subjective privacy can be addressed through the specific human efforts taken to protect privacy online. Table 3 shows the different perspectives on privacy protection.

Since there are no means to help users determine for themselves what information to share, with whom to share it, and how to control the dissemination of information, consumers have resorted to other methods in order to protect their personal information and still receive goods and services from online vendors (Chen & Rea, 2004). Chen and Rea (2004) developed three factors, described as privacy controls, that relate to different behaviors adopted by consumers to protect their personal information online. The three factors are falsification of private information, passive reaction, and identity modification. Privacy controls are defined as consumers' ability to hold control over an unwanted presence in the environment (Goodwin, 1991).

Framework

In this section, we propose an integrative framework for online privacy protection behavior. The proposed framework, as shown in Figure 1, builds upon prior research and integrates research findings to provide insight into the phenomenon of online privacy protection behavior.

Dependent Variable
Several differing factors that contribute to an individual's overall behavior to protect their privacy have been discussed in the literature (Chen & Rea, 2004). The first factor, falsification, refers to altering one's personal information and removing browser cookies when registering for online Web sites. The second factor, passive reaction, refers to simply ignoring or deleting the intrusions of others. The third factor, identity modification, refers to changing one's personal identity by using gender-neutral identities or multiple identities when registering for online services.

Independent Variables
Our literature analysis showed that a wide range
of variables have been used to predict online
privacy protection behavior. These predictor
variables can be classified as privacy concerns,
Internet experience, demographics, and awareness of privacy issues as shown in Figure 1.

Privacy Concerns

Privacy concerns arise from the fear that the faith


that consumers put in online companies will be
violated. When companies gather personal information from consumers, there is an implied social
contract that companies will act upon the collected
information as they have agreed (Phelps, Nowak, & Ferrell, 2000).

Table 3. Perspectives on online privacy protection

Prerogative: a political or legal issue that can be addressed by philosophical, political, or legal debates.
Objective: can be addressed by measuring the effectiveness of privacy protection strategies.
Subjective: can be addressed by focusing on factors that determine the adoption of specific privacy protection strategies.

The implied social contract is
violated if information is collected without consumers' awareness, if the collected information is used for purposes other than those about which the consumer has been informed, or if the collected information is shared with third parties without consumers' consent (Phelps, D'Souza, & Nowak, 2001; Smith, Milberg, & Burke, 1996; Smith, 1996). Because of privacy concerns, consumers are unwilling to disclose personal information to online companies. Consumers' unwillingness to disclose can be attributed to a perceived lack of environmental control and information control (Goodwin, 1991). The perceived lack of information control is related to privacy concerns and is central to the issue of this chapter, while environmental control is related to security concerns (Gurung, 2006).
One of the few studies that examined the relationship between online privacy concerns and online behavior found some significant correlations (Sheehan & Hoy, 1999). As privacy concerns increased, consumers were less likely to register for Web sites, more likely to provide incomplete information to Web sites, more likely to report spam, more likely to request removal from mailing lists, and more likely to send highly negative messages or flames to those sending unsolicited e-mail (Sheehan & Hoy, 1999). Privacy concerns regarding unauthorized use and concerns about giving out personal information were also found to be significantly related to privacy protection behavior (Chen & Rea, 2004).

Internet Experience
As consumers become more experienced in using the Internet, they are likely to become familiar with privacy protection strategies. A relationship between an individual's Internet experience and their adoption of privacy protection strategies has been suggested in the literature, since Internet experience helps to increase behavioral control, which is considered significant in predicting privacy protection behavior (Yao, 2005). Internet experience has also been linked to the use of privacy protection measures (Yao, 2005). Such past behavior reinforces the behavioral control that one has over privacy protection and thus acts as a predictor of future privacy protection behavior.

Demographics
Among the demographic variables of age, gender, and race, Chen and Rea (2004) found that gender and race are significant factors in privacy protection behavior. Their findings suggest that male consumers are more likely to falsify personal information than are female users. Their results further implied that the quality of personal information collected online may vary among racial groups. Phelps et al. (2000) found that among demographic variables such as gender, marital status, age, education, employment status, and income, only education was significantly related to privacy concerns. They reported that respondents who had vocational or some college education were associated with the highest levels of privacy concern. This further supports the contention that demographic variables may be related to privacy protection behavior.

Awareness
Consumers may be more likely to adopt privacy protection behavior if they are aware of the malpractices of online companies and of the extent and severity of the privacy violations that could occur. In their research on anti-spyware tools, Hu and Dinev (2005) found that awareness was a key predictor of anti-spyware adoption behavior. Users were likely to run anti-spyware tools only when they became aware that their personal computers were infected with spyware and when they were aware of the negative consequences posed by spyware. The concept of awareness has been defined as the initial stage in the innovation diffusion process model (Rogers, 1995).

Figure 1. Framework for the antecedents of online privacy protection behavior. The framework depicts privacy concerns, Internet experience, demographics, and awareness as antecedents of online privacy protection behavior.

Dinev and Hu (2007) suggest that awareness may be related to situational awareness and a problem-solving process that includes identifying the problem, raising consciousness, and resolving the problem. In their research on protective information technologies, Dinev and Hu (2007) found that awareness is a significant predictor of the adoption of anti-spyware tools. Therefore, prior research findings suggest that awareness may be related to online privacy protection behavior.

Discussion

The proposed framework provides a comprehensive approach for studying the online privacy protection behavior of consumers. There has been much literature assessing consumers' concerns about privacy, and it is well established that consumers are concerned about their privacy in general. What we need to understand is how consumers are dealing with these concerns. The framework proposes that, apart from privacy concerns, Internet experience, demographics, and awareness are also important antecedents for predicting online privacy protection behavior. Only those variables that have been researched in the past have been included, and they should not be regarded as an exhaustive list of predictors of protection behavior. There may be other important factors that could further contribute to the prediction of online privacy protection behavior; one such variable is self-efficacy. The developed framework in this study can be used to formulate and test various hypotheses related to the adoption of privacy protection behavior. The defined variables in the framework may also be applicable to the development of research models of e-commerce adoption and information disclosure.

Identification of the factors that are related to consumers' privacy behavior would help companies formulate policies and strategies that could influence consumer satisfaction and thereby increase consumer confidence. Consumers disclose information based on their personal assessment of the risks and the benefits. Having a better idea of the privacy-related variables will help companies focus on significant areas where they can foster their relationships with consumers. By understanding the mechanisms used by consumers for privacy protection, companies can develop policies to raise consumer confidence and to increase the probability of disclosure of personal information.


Possible Future Research Directions

Future research can be undertaken to empirically validate the proposed framework. Either surveys or experiments can be conducted to collect the empirical data needed to test the model. One can also expand the model by adding other relevant variables such as self-efficacy, previous privacy violation, and personality traits. Privacy concerns may also vary with consumers' culture; cultural differences in privacy concerns, and how they affect privacy protection behavior, offer another avenue for furthering this research.
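As an illustration only, and not part of the original study, the sketch below shows how survey responses could be used to test the proposed framework with an ordinary least squares regression. The data file and column names are hypothetical placeholders for measures of the four antecedents and of protection behavior.

# Hypothetical sketch: regressing online privacy protection behavior on the
# four antecedents in the proposed framework. The CSV file and column names
# are assumptions for illustration, not data from this chapter.
import pandas as pd
import statsmodels.formula.api as smf

# Each row is one survey respondent; scale scores are averaged multi-item measures.
df = pd.read_csv("privacy_survey.csv")  # hypothetical file

model = smf.ols(
    "protection_behavior ~ privacy_concerns + internet_experience"
    " + C(gender) + C(race) + awareness",
    data=df,
).fit()

print(model.summary())  # coefficients indicate each antecedent's contribution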

Conclusion
This research was undertaken with the objective of
investigating how online privacy has been studied. A framework for online privacy protection
behavior was proposed based on the literature
review. The proposed framework provides us with
a roadmap to further analyze and understand the
privacy concerns of consumers and consequent
strategies taken by consumers to protect their
privacy online. Implications for research and
practice were discussed. Further, we hope that the
proposed research directions will help to encourage more research in this exciting area.

References
Awad, N. F., & Krishnan, M. S. (2006). The
personalization privacy paradox: An empirical
evaluation of information transparency and the
willingness to be profiled online for personalization. MIS Quarterly, 30(1), 13-28.
Belanger, F., Hiller, J. S., & Smith, W. J. (2002).
Trustworthiness in electronic commerce: The role
of privacy, security, and site attributes. Journal

of Strategic Information Systems, 11(3-4), 245-270.


Chen, K., & Rea, A. I. J. (2004). Protecting
personal information online: A survey of user
privacy concerns and control techniques. Journal
of Computer Information Systems, 44(4), 85-92.
Culnan, M. (2000). Protecting privacy online: Is
self-regulation working? Journal of Public Policy
& Marketing, 19(1), 20-26.
Dinev, T., & Hart, P. (2006). An extended privacy
calculus model for e-commerce transactions.
Information Systems Research, 17(1), 61-80.
Dinev, T., & Hu, Q. (2007). The centrality of
awareness in the formation of user behavioral
intention toward protective information technologies. Journal of the Association for Information
Systems, 8(7), 386-408.
Drennan, J., Mort, G. S., & Previte, J. (2006).
Privacy, risk perception, and expert online behavior: An exploratory study of household end
users. Journal of Organizational and End User
Computing, 18(1), 1-22.
Federal Trade Commission. (1996). Consumer
information privacy hearings.
Goodwin, C. (1991). Privacy: Recognition of
a consumer right. Journal of Public Policy &
Marketing, 10(1), 149-166.
Gurung, A. (2006). Empirical investigation of the
relationship of privacy, security and trust with
behavioral intention to transact in e-commerce.
Unpublished Dissertation, University of Texas at
Arlington, Arlington.
Hoffman, D. L., Novak, T. P., & Peralta, M. A.
(1999). Information privacy in the marketspace:
Implications for the commercial uses of anonymity on the web. The Information Society, 15(4),
129-139.
Hu, Q., & Dinev, T. (2005). Is spyware an internet
nuisance or public menace? Communications of
the ACM, 48(8), 61-66.



Hui, K.-L., Teo, H. H., & Lee, S.-Y. T. (2007). The


value of privacy assurance: An exploratory field
experiment. MIS Quarterly, 31(1), 19-33.
Liu, C., Marchewka, J., Lu, J., & Yu, C. (2004).
Beyond concern: A privacy-trust-behavioral
intention model of electronic commerce. Information & Management, 42(1), 127-142.
Malhotra, N. K., Kim, S. S., & Agarwal, J. (2004).
Internet users' information privacy concerns
(IUIPC): The construct, the scale, and a causal
model. Information Systems Research, 15(4),
336-355.
McKnight, D. H., & Chervany, N. L. (2001). What
trust means in e-commerce customer relationships: An interdisciplinary conceptual typology.
International Journal of Electronic Commerce,
6(2), 35-59.
Milne, G. R. (2000). Privacy and ethical issues
in database/interactive marketing and public
policy: A research framework and overview of
the special issue. Journal of Public Policy &
Marketing, 19(1), 1-6.
Milne, G. R., & Culnan, M. J. (2002). Using the
content of online privacy notices to inform public
policy: A longitudinal analysis of the 1998-2001
U.S. web surveys. The Information Society, 18(5),
345-359.
Miyazaki, A. D., & Fernandez, A. (2000). Internet
privacy and security: An examination of online
retailer disclosures. Journal of Public Policy &
Marketing, 19(1), 54-61.
Miyazaki, A. D., & Fernandez, A. (2001). Consumer perceptions of privacy and security risks
for online shopping. The Journal of Consumer
Affairs, 35(1), 27-44.
Olivero, N., & Lunt, P. (2004). Privacy versus
willingness to disclose in e-commerce exchanges:
The effect of risk awareness on the relative role of
trust and control. Journal of Economic Psychology, 25(2), 243-262.



Phelps, J. E., D'Souza, G., & Nowak, G. J. (2001).


Antecedents and consequences of consumer
privacy concerns: An empirical investigation.
Journal of Interactive Marketing, 15(4), 2-17.
Phelps, J. E., Nowak, G. J., & Ferrell, E. (2000).
Privacy concerns and consumer willingness to
provide personal information. Journal of Public
Policy & Marketing, 19(1), 27-41.
Ranganathan, C., & Ganapathy, S. (2002). Key dimensions of business-to-consumer web sites. Information & Management, 39(6), 457-465.
Rogers, E. M. (1995). Diffusion of innovations
(4th ed.). New York: Free Press.
Sheehan, K. B. (2002). Toward a typology of
internet users and online privacy concerns. The
Information Society, 18(1), 21-32.
Sheehan, K. B., & Hoy, M. G. (1999). Flaming,
complaining, abstaining: How online users respond to privacy concerns. Journal of Advertising,
28(3), 37-51.
Sheehan, K. B., & Hoy, M. G. (2000). Dimensions
of privacy concern among online consumers.
Journal of Public Policy & Marketing, 19(1),
62-73.
Smith, H. J. (1996). Information privacy: Measuring individuals' concerns about organizational practices. MIS Quarterly, 20(2), 167-196.
Smith, H. J., Milberg, S. J., & Burke, S. J. (1996). Information privacy: Measuring individuals' concerns about organizational practices. MIS Quarterly, 20(2), 167-196.
Suh, B., & Han, I. (2003). The impact of customer
trust and perception of security control on the acceptance of electronic commerce. International
Journal of Electronic Commerce, 7(3), 135-161.
U.S. Census Bureau. (2007). The census bureau
of the department of commerce report on retail
e-commerce sales.


Westin, A. (1967). Privacy and freedom. New


York: Atheneum.
Yao, M. Z. (2005). Predicting the adoption of
self-protections of online privacy: A test of an
expanded theory of planned behavior model. Unpublished dissertation, University of California,
Santa Barbara.

Additional Reading
Culnan, M. J. (1993). How did they get my name? An exploratory investigation of consumer attitudes toward secondary information use. MIS Quarterly, 17(3), 341-361.
Culnan, M., & Armstrong, P. (1999). Information privacy concerns, procedural fairness, and
impersonal trust: An empirical investigation.
Organization Science, 10(1), 104-115.
Greenaway, K. E., & Chan, Y. E. (2005). Theoretical explanations for firms' information privacy
behaviors. Journal of Association for Information
Systems, 6(6), 171-198.

Luo, X. (2002). Trust production and privacy


concerns on the internet: A framework based
on relationship marketing and social exchange
theory. Industrial Marketing Management, 31(2),
111-118.
Milne, G. R., & Culnan, M. J. (2002). Using the
content of online privacy notices to inform public
policy: A longitudinal analysis of the 1998-2001
U.S. web surveys. The Information Society, 18(5),
345-359.
Stewart, K. A., & Segars, A. H. (2002). An empirical examination of the concern for information privacy instrument. Information Systems
Research, 13(1), 36-49.
Udo, G. J. (2001). Privacy and security concerns
as major barriers for e-commerce: A survey study.
Information Management & Computer Security,
9(4), 165-174.
Wang, H., Lee, M. K., & Wang, C. (1998). Consumer privacy concerns about internet marketing.
Communications of the ACM, 41(3), 63-70.
Warren, S. D., & Brandeis, L. D. (1890). The right
to privacy. Harvard Law Review, 4(5), 193-220.

Henderson, S. C., & Snyder, C. A. (1999). Personal information privacy: Implications for MIS managers. Information & Management, 36(4), 213-220.



Section III

Empirical Assessments



Chapter VIII

Privacy Control and Assurance: Does Gender Influence Online Information Exchange?
Alan Rea
Western Michigan University, USA
Kuanchin Chen
Western Michigan University, USA

Abstract
Protecting personal information while Web surfing has become a struggle. This is especially the case
when transactions require a modicum of trust to be successfully completed. E-businesses argue that
they need personal information so they can create viable data to tailor user interactions and provide
targeted marketing. However, users are wary of providing personal information because they lack trust
in e-businesses' personal information policies and practices. E-businesses have attempted to mitigate
user apprehension and build a relationship base in B2C transactions to facilitate the sharing of personal information. Some efforts have been successful. This chapter presents survey results that suggest
a relationship between gender and how users control personal information. The findings suggest that
e-businesses should modify information and privacy policies to increase information and transactional
exchanges.

Introduction
In the past few years we have witnessed the
competing interests of technological convenience,

personal privacy, and e-business needs. Consumers are finding that e-businesses are asking for, or taking, more personal information than they may
be willing to give in order to utilize goods and



services. E-businesses counter that they only take


necessary information to complete transactions
and perform effective marketing and customization of their products and services.
Consumers want to actively control how much
personal information they disclose depending
on the level of trust inherent in each e-business relationship. A consumer using a familiar
Web site with privacy policies she trusts will be
more willing to divulge crucial information an
e-business needs, such as demographic data and
shopping preferences. E-businesses want to create an atmosphere that will foster this trust and
information sharing.
However, there is a palpable tension between
consumers and e-businesses at the start of a partnership. This tension exists because of a lack of
trust between users and e-businesses. This mistrust is not unfounded. E-businesses have a poor
record when it comes to protecting consumers'
privacy online.

Privacy and the Consumer


The popular Apple iTunes software is no stranger
to privacy indiscretions. In early 2006, Apple
released iTunes version 6.0.2 which included a
new feature called the MiniStore (Borland, 2006).
The MiniStore enabled iTunes to offer customized
user recommendations based on past browsing
and purchases. Granted, this customizable feature
offered a means to enable users to find more personalized selections. However, computer experts
found that in addition to the song selection, unique
data about each user was sent back to Apple via
the MiniStore (McElhearn, 2006). Once this information was found, the software's user agreement
was analyzed by experts who found no mention of
this particular MiniStore functionality (Borland,
2006). Apple soon recanted and explained to users
how to turn off this feature. In all new versions of
iTunes, MiniStore functionality must be enabled
by users (Apple, 2007).



However, Apple's iTunes is once again in the


privacy spotlight. In 2007, researchers discovered
that all DRM-free music purchased via iTunes
embeds each user's personal information in the
file (Fisher, 2007). Additional research found that
all iTunes purchases include this information with
no explanation from Apple.
Apple is not the only organization tracking
users information without their knowledge.
Microsoft Windows Media Player stores data
about all users' media files that they watch either
online or on DVDs. The Media player encodes
all selections with a Windows Media Player ID
number specific to each user (Festa, 2002; Smith,
2002a). This information is then sent to an online
Microsoft database. These SuperCookies can
be used to track user viewing habits, Web surfing preferences, and other personal information.
While Microsoft denies any plans to use this data
and provides instructions on how to disable this
feature on its support pages (Smith, 2002b), the
feature is on by default until a user completes a
series of steps hidden within a detailed privacy
statement (Microsoft, 2003).
Other companies have also amassed consumers' data without their knowledge. In 1999, researchers learned that Comet Cursor was tracking
the clickstreams of over 16 million people who
downloaded the free software (Oakes, 1999). Other
companies that have tracked, or are tracking,
online user movements include RealNetworks,
DoubleClick, HitBox, and X10. Some companies,
such as DoubleClick, had discontinued tracking in
favor of consumer privacy because of lawsuits and
user complaints (Krill, 2002). However, Google's
pending acquisition of DoubleClick raises new
concerns (EPIC, 2007).
Ultimately, while many of these e-businesses
have either changed data collection practices
or written the procedures into privacy policies,
users still are not always aware of the privacy
implications. Moreover, much amassing of data is
conducted without users' awareness. Companies such as WebTrends specialize in offering e-businesses detailed user data and Web site usage in order to analyze what users do at the e-business
site (WebTrends, 2007). Users are not usually made
aware of the clickstream tracking, page views, or
other data being collected about them with each
click and view of a Web page within a site.

Lack of Implemented Privacy Techniques
As early as 1997, the Electronic Privacy Information Center (EPIC) released a study noting that
Web users (a.k.a. surfers) needed to be aware of
Web site usage and privacy policies (EPIC, 1997).
The examples above illustrate that even when privacy policies exist and users should be aware of them, e-businesses can use technology to take personal information without asking, or at least without intentionally informing consumers.
However, many e-businesses are working to simultaneously acquire necessary information and protect consumers' personal privacy. In early 2000, some e-businesses began implementing P3P (Platform for Privacy Preferences) on their Web sites. P3P is an XML-based language that enables e-businesses to encode their Web privacy statements in a standardized markup syntax (W3C, 2007). Using this technology would allow all Web browsers to find and read e-businesses' privacy policies. Consumers would then be able to set acceptable personal privacy criteria and would be notified by their Web browsers whether a Web site meets these criteria (Levy & Gutwin, 2005). P3P promised much to consumers because it would allow them to decide not only whether they wanted to share personal information with a Web site but also what information to share (Radcliff, 2001).
Unfortunately, the World Wide Web Consortium (W3C) suspended work on the P3P platform
in late 2006 because of a lack of support from
Web browser developers (W3C, 2006). Interestingly enough, the W3C P3P group notes that the
standard is ready for implementation even though
Web browsers currently do not support it. Perhaps
future Web browser versions will.
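For readers curious about what a P3P deployment looks like in practice, the short sketch below fetches a site's P3P policy reference file from the well-known location defined by the P3P 1.0 specification (/w3c/p3p.xml) and lists the policies it points to. The host name is a placeholder, and since browser support never materialized, most sites never published such a file.

# Minimal sketch: look for a site's P3P policy reference file at the
# well-known location (/w3c/p3p.xml) and list the policies it references.
# The host below is a placeholder; many sites publish no P3P file at all.
from urllib.request import urlopen
from urllib.error import URLError
import xml.etree.ElementTree as ET

def list_p3p_policies(host):
    url = f"https://{host}/w3c/p3p.xml"
    try:
        with urlopen(url, timeout=10) as response:
            tree = ET.parse(response)
    except (URLError, ET.ParseError) as exc:
        print(f"No readable P3P policy reference file at {url}: {exc}")
        return []
    # POLICY-REF elements point (via their 'about' attribute) to full policies.
    return [
        element.get("about")
        for element in tree.iter()
        if element.tag.endswith("POLICY-REF")
    ]

if __name__ == "__main__":
    print(list_p3p_policies("www.example.com"))  # placeholder host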

Because there are no massively-deployed


technologies to assist users who want to protect
personal data, most use passive and active measures during online transactions. Consumers
may not give out information (passive) or supply
false information (active) in order to complete
a transaction. As a result, e-businesses suffer
from lost or incomplete transactions, or partial
or incorrect data. Without assurances about Web
site security, factoring in privacy and trust considerations, consumers are unwilling to supply the necessary personal information in order to complete an online purchase. A recent Gartner study estimates that over 2 billion dollars were lost in 2006 due to consumers' security fears over e-business transactions (Schuman, 2006).

Tension Between Desire to Use and Desire to Share
Consumers using active and passive measures
underscore the tension between their desire to use
technology and online services, and the level of
information they are willing to provide. Invasive
technology, such as the Media Player and Comet
Cursor, take personal data without a users
permission and not only cause user concern but
also hinder trust between a user and an e-business.
On the other hand, P3P and related technologies
offer the control users need to decide whether or
not to share personal information. In other words,
the more control users have over their personal
information, the more trust invoked between the
entities and the less user concern. P3P offers users
the chance to negotiate with each e-business as
to how much personal information they want to
share (W3C, 2007).
In order to understand how e-businesses can
foster better relationships with consumers, we
studied the correlation matrix among the criteria
of trust, concern, and control as they relate to consumers' online privacy. In our research, we have found significant gender differences in privacy-related tendencies and concerns (Chen & Rea, 2004).




This discussion supports and extends the factor


of gender as crucial to relationship negotiation
between consumer and e-business. Using this
information, we can suggest when consumers
might more readily share personal information
with an e-business. We also delve into the relationships between gender and two categories of
privacy constructs: the ability to control, actively or passively, the unwanted presence of others, and Internet privacy concerns. Finally, we look
at how we came to these findings, and explore the
implications for future research in this area.

Background
Negotiating trust is especially important in this
study because all online interactions require
some level of trust between the consumer and
the e-business. From a simple hyperlink click
to a complex e-commerce purchase, trust must
be negotiated before users are willing to share
personal data in exchange for goods, services,
or information. E-businesses must understand
how to establish and maintain this trust in order
to be successful. A crucial enabling component
for trust is privacy.

Privacy
Privacy takes many forms. Some researchers view
it as a moral, legal, or consumer right (Goodwin,
1991; Papazafeiropoulou & Pouloudi, 2001; Han
& Maclaurin, 2002). Others view it within the
context of a social power struggle (Campbell
& Carlson, 2002), economic theory (Hemphill,
2002), or commitment-trust theory (Mukherjee
& Nath, 2007). Some view it as simply a need
to sustain personal space (Gumpert & Drucker,
1998; Clarke, 1999) or a necessary psychological
condition (Yao, Rice, & Wallis, 2007).
This study does not address the privacy needs
of personal space in terms of biometrics, video
monitoring, or workplace surveillance. Instead,


we measure consumers' need for online privacy, or information privacy:


Information privacy refers to the claims of individuals that data about themselves should generally not be available to other individuals and
organizations, and that, where data is possessed
by another party, the individual must be able to
exercise a substantial degree of control over that
data and its use. (Clarke, 1999)
Key components can be extracted from this definition. First, we see users' concerns that personal data should not be taken without their knowledge. Consumers raged at Apple's iTunes, Windows Media Player, Comet Cursor, and DoubleClick for acquiring information without active consent. Consumer concern over the uninformed harvest of personal information is high. This concern quickly translates into a lack of trust between users and e-businesses, since consumers experience a lack of control over what information Web sites have collected about them (Hoffman, Novak, & Peralta, 1999). Numerous studies discuss the importance of consumers' control over personal information as the basis for establishing online trust (Wang, Lee, & Wang, 1998; Tavani & Moor, 2001; Han & Maclaurin, 2002; Hemphill, 2002; Roussos & Moussouri, 2004; Ashrafi & Kuilboer, 2005; Metzger, 2006) and promoting consumer privacy. While e-businesses argue that the collected data will not be used without consent, acquiring the data without explicitly asking does not sit well with users and privacy advocates (Yousafzai, Pallister, & Foxall, 2003; Duffy, 2005; Flavin & Guinalu, 2006; Pan & Zinkhan, 2006). Conversely, asking for users' consent can be regarded as a shift of information control to the users (Eastlick, Lotz, & Warrington, 2006; Van Dyke, Midha, & Nemati, 2007). Many e-businesses are concerned that this control shift will limit their access to crucial information and are wary of ceding this power to users.


It follows, then, that one of the best means of addressing users' concerns and building trust is for e-businesses to allow users to control their personal information. However, it is not always feasible in a business context to permit complete control; therefore, e-businesses should inform users via online privacy policies how the collected information will be used. This allows informed users to decide whether they should provide personal information in exchange for goods and services (Han & Maclaurin, 2002; Ashrafi & Kuilboer, 2005; Flavin & Guinalu, 2006;
Pan & Zinkhan, 2006; Shalhoub, 2006; Lauer &
Deng, 2007). Researchers note that companies
that inform users how their information will be
used begin to build online relationships crucial
for success:
In some respects, the lack of other means in cyberspace of establishing customer relationships
and trust based on reputation and personal contact demands that firms reveal their policies on
information disclosure, informed consent, and
handling disputes. (Schoder & Yin, 2000)
It would seem that a key measurement of how
much personal information users are willing to
share hinges on the trust level in the relationship between users and e-businesses (Roman,
2007).

Trust
Before users choose to enter into a relationship
with a business, they must first be convinced that
it is in their best interest. Consumers look at discounts, reputation, and other factors before they
decide they will enter a physical store to conduct
business (So & Sculli, 2002; Duffy, 2005; Eastlick et al., 2006; Chen & Barnes, 2007; Roman,
2007). There must also be some factor of initial
trust before a user will enter into an e-business
transaction:

The initial trust model, which assumes that parties


barely know each other, also seems appropriate
for the distant, impersonal relationships that characterize most Web vendor/customer relationships.
(McKnight, Choudhury, & Kacmar, 2000)
Many studies have looked at methods through
which e-businesses can gain users' trust (Hoffman et al., 1999; McKnight et al., 2000; Phelps,
Nowak, & Ferrell, 2000; Schoder & Yin, 2000;
Riegelsberger, Sasse, & McCarthy, 2003; Aiken
& Boush, 2006; Hui, Tan, & Goh, 2006; Pan &
Zinkhan, 2006). Without the means to establish
the trust relationship, there can be no viable
interaction.
Initial trust becomes the challenge for e-businesses as each Web site occupies a single clickspace for users. Current studies look at various
factors e-business can use to foster this initial
trust, such as third-party trustmarks (Noteberg,
Christiaanse, & Wallage, 2003; Hu, Lin, & Zhang,
2003; Kim, Steinfield, & Lai, 2004; Patton &
Josang, 2004; Moores, 2005; Aiken & Boush,
2006), privacy policy statements (Han & Maclaurin, 2002; Ashrafi & Kuilboer, 2005; Flavin
& Guinalu, 2006; Pan & Zinkhan, 2006), brand
reputation (So & Sculli, 2002; Yousafzai et al.,
2003; Duffy, 2005; Aiken & Boush, 2006; Eastlick
et al., 2006; Metzger, 2006; Roman, 2007), and
Web site design and content (Janda, Trocchia, &
Gwinner, 2002; Chen & Barnes, 2007). However,
it is still ultimately the trust factor or lack of
faith noted almost 10 years ago by Hoffman,
Novak, and Peralta (1999) that plays one of the
most crucial roles in e-business/consumer relational success.
Without trust, users will not enter into relationships with e-businesses, and they will not
provide any personal information to e-business
sites: Almost 95% of Web users have declined
to provide personal information to Web sites at
one time or another when asked (Hoffman et
al., 1999). More recent studies share statistics
similar to Hoffman et al. (1999), suggesting that


not much has changed in terms of the consumer/


e-business trust relationship (Moores, 2005; Pan
& Zinkhan, 2006).
Trust must be established and maintained in
order to foster and encourage a relationship in the
e-business sphere. Trust is a construct that must
be negotiated between a user and an e-business.
What makes trust difficult is that this negotiation
factor differs not only by the type of e-business
or user but also by each relationship. Studies have
looked at users' disposition to trust (McKnight
et al., 2000), relationship exchanges (Hoffman
et al., 1999), and consumer perceptions regarding
the ethics of online retailers (CPEO) (Roman,
2007) to explain how these relationship negotiations can be accomplished.
Another factor that must be considered in
this trust negotiation is gender. Researchers have
found significant differences in gender tendencies
toward Internet usage (Teo & Lim, 1997; Allen,
2000; Nachimias, Mioduser, & Shelma, 2001;
Roach, 2001; Sexton, Johnson, & Hignite, 2002;
Ono & Zavodny, 2005; Chesley, 2006; Kang &
Yang, 2006; Fraser & Henry, 2007), e-business
(Papazafeiropoulou & Pouloudi, 2001; Sexton
et al., 2002; Garbarino & Strahilevitz, 2004;
Ha & Stoel, 2004; Fraser & Henry, 2007), and
exploratory work concerning privacy (Hupfer &
Detlor, 2007; Yao et al., 2007). Our study supports
and extends the factor of gender as crucial to the
relationship negotiation and users' attitudes as to when, or if, they will share personal information with an e-business.

Main Thrust of the Chapter


Concerns about giving out personal information
in the e-commerce environment cause low user
involvement in online transactions (Miyazaki &
Fernandez, 2001; Aiken & Boush, 2006). Users
almost always find it unacceptable for marketers to
use their personal information for purposes other
than current, and sometimes future, transactions


that require personal information whether it occurs


within traditional marketing methods (Goodwin,
1991; Nowak & Phelps, 1992; Wang & Petrison,
1993) or online (Tavani & Moor, 2001; Flavin
& Guinalu, 2006; Roman, 2007). Both shoppers
and non-shoppers worry about issues of acquisition and dissemination of consumer data (Rohm
& Milne, 1998; Tavani, 1999; Berendt, Gnther,
& Spiekermann, 2005; Mukerjee & Nath, 2007).
These concerns are usually triggered by more than
one catalyst, such as age (Han & Maclaurin, 2002)
or education (Ono & Zavodny, 2005; Flavin &
Guinalu, 2006). However, in many cases, a major
catalyst is linked to gender (Sexton et al., 2002;
Garbarion & Strahilevitz, 2004; Ha & Stoel, 2004;
Chesley, 2006; Fraser & Henry, 2007).

Gender-Linked Privacy Questions


Male users have been reported to use the Internet
more frequently and for a greater number of tasks
than female users (Teo & Lim, 1997; Teo, Lim,
& Lai, 1999; Papazafeiropoulou & Pouloudi,
2001; Nachimias et al., 2001; Sexton et al., 2002;
Flavin & Guinalu, 2006). Although Internet
usage patterns are shifting toward a more equal
gender balance (Roach, 2001; Ono & Zavodny,
2003; Ono & Zavodny, 2005), e-business transactions remain a male-dominated realm (Garbarino
& Strahilevitz, 2004; Ha & Stoel, 2004; Fraser
& Henry, 2007). The increased level of Internet
experience and utilization of online tasks has
been negatively related to concerns about online
privacy (Hoffman et al., 1999; Flavin & Guinalu,
2006). However, other studies indicate that online
privacy concerns are more readily attributed to
educational level (Phelps et al., 2000; Flavin &
Guinalu, 2006), or other demographic characteristics, such as marital status (Chesley, 2006), age
(Han & Maclaurin, 2002), or employment (Ono
& Zavodny, 2005).
Ultimately, there is a strong indication that
gender is a major factor relative to privacy concerns. Statistics from an early study show that


87% of female Internet users were very concerned


about threats to their personal privacy while only
76% of male Internet users were very concerned.
Furthermore, women registered higher levels of
concern on every privacy-related issue about
which they were questioned (Ackerman & Lorrie,
1999). More recently, Friedman, Kahn, Hagman,
Severson, and Gill (2006) report in a study, The
Watcher and the Watched, that females were
more concerned than males about all aspects of
privacy. In this study, researchers recorded more
than 900 people passing through a public space.
Afterward, those filmed were asked their reactions to whether these images should be publicly
viewed in real-time at the location, in real-time at
another location (anywhere in the world), or saved
and distributed online. Both males and females
had some concerns over one of the three options.
However, in almost all cases, females expressed
privacy concerns over displaying the video and
images no matter what the situational context
or if they were in the role of the watcher or the
watched. Ultimately, females were more aware
of the implications of private information within
public spaces (Friedman et al., 2006).
This awareness can be linked to how women
communicate on the basis of network-oriented and
collaborative tasks while men, on the other hand,
communicate to elevate their social hierarchy
(Kilbourne & Weeks, 1997; Teo & Lim, 1997).
Female users consider e-mail to have a higher
social presence (Gefen & Straub, 1997), and tend
to focus more on messaging and community formation (Teo & Lim, 1997). Men's use of electronic
media involves exchanging messages to disseminate information (Brunner, 1991). Men also focus
more on searching, downloading, and purchasing
(Teo & Lim, 1997; Ha & Stoel, 2004).
Women's use of electronic media for collaborative purposes may involve exchanging
messages that contain personal information,
views, and other information not meant for public
disclosures. Moreover, studies have shown most
women rely on recommendations from social

networks to make purchasing decisions (Garbarino & Strahilevitz, 2004). Studies of women in
female-only discussion groups show women to
be more focused on self-disclosure and individual
opinions; women respond directly to others in
the discussion groups. Conversely, men in male-only discussion groups do not self-disclose, and
instead argue to win discussions (Savicki, Kelley,
& Lingenfelter, 1996).
Because the content of communication messages largely affects the participants' willingness
to share the messages with others, it is likely that
women would prefer more privacy protection
than men. Therefore, we conjecture the following hypothesis:
H1: Female users are more concerned with online
privacy practices than male users.
Consumers can be very uncomfortable sharing their personal data with others because they
are sensitive to disclosing such data (Phelps et
al., 2000; Han & Maclaurin, 2002; So & Sculli,
2002; Duffy, 2005; Aiken & Boush, 2006; Flavin & Guinalu, 2006). The term information
sensitivity refers to the level of privacy concern
an individual feels for a type of data in a specific
situation (Weible, 1993). Despite concerns about
online privacy, consumers do realize that personal
information is important to online marketers.
Consequently, they are willing to provide such
information when Web sites provide privacy
statements explaining how the collected information would be used (Hoffman et al., 1999; Han
& Maclaurin, 2002; Ashrafi & Kuilboer, 2005;
Flavin & Guinalu, 2006; Pan & Zinkhan, 2006).
The disclosure of such privacy statements and
similar constructs, such as trustmarks, is related
to higher levels of involvement in e-commerce
(Miyazaki & Fernandez, 2000; Noteberg et al.,
2003; Hu et al., 2003; Kim et al., 2004; Patton &
Josang, 2004; Moores, 2005; Aiken & Boush,
2006). Thus, online privacy issues are also related
to (1) an individuals willingness to share personal




data and (2) the privacy disclosure statements of


online vendors.
Although some exploratory studies (Garbarino
& Strahilevitz, 2004; Ha & Stoel, 2004; Flavin
& Guinalu, 2006; Fraser & Henry, 2007) indicate
a relationship between gender and privacy issues,
they have not demonstrated a direct relationship
between gender and these two types of privacy
issues (willingness and disclosure). Nonetheless,
women have been found to process information
in more detail and thus are more aware of, and
sensitive to, changes in their environments (Meyers-Levy & Maheswaran, 1991). Women are more
likely to be irritated than men by ambient factors
(e.g., background conditions), store design factors
(e.g., aesthetic and functional aspects), and social
factors (e.g., customers in the same environment)
(D'Astous, 2000; Garbarino & Strahilevitz, 2004;
Ha & Stoel, 2004).
Even though most of the indicated studies
focused on women's sensitivity to changes in their
physical environments, it is likely this sensitivity
would continue in the virtual environment.
Because of these reasons, we hypothesize that
gender difference may be significant in the context
of online privacy:
H2: Female users are less likely to share their
personal data with online vendors.
H3: Female users are more concerned with the
disclosure of privacy statements on Web
sites.
Consumers' ability to control their interactions
within Web sites affects their perceptions of online
security and privacy practices (Hoffman et al.,
1999; Han & Maclaurin, 2002; Nam, Song, Lee,
& Park, 2005; Pan & Zinkhan, 2006). Goodwin
(1991) defines this type of control as control over
unwanted presence in the environment. However,
in some environments, such as the Internet, users
cannot easily control the unwanted presence of
others. For example, users can delete unsolicited



e-mail thus controlling the unwanted presence in


a passive way. Alternatively, users can configure
their e-mail software to filter out unwanted future
e-mail, respond to the sender to opt out of mailing
lists, or report the offense to a third party service,
thereby branding the senders as spammers. In
instant messaging, users can block others from
their friend lists or simply ignore them. Therefore,
even though intrusion into users' environments due to the publicly accessible nature of the Internet is unavoidable, users' reactions to such intrusions
can be of two types: active control (block) and
passive control (ignore).
While women consider electronic communications useful, it is men who are more comfortable
actively using software tools for such Internet
activities (Arbaugh, 2000; Sexton et al., 2002).
Existing studies report that the literature discussing the relationship between computer anxiety and
gender is relatively inconclusive (Chua, Chen, &
Wong, 1999; King, Bond, & Blandford, 2002).
However, in terms of online shopping, women have
reported difficulty locating Internet merchandise
and navigating through modern Web interfaces
(Fram & Grady, 1997; Garbarino & Strahilevitz,
2004; Ha & Stoel, 2004). Compounded with fewer
interests in computer-related activities, female
users are less likely to have a favorable attitude
toward computer usage (Qutami & Abu-Jaber,
1997; Nachimias et al., 2001).
In studying gender differences in Web searching, researchers found that men are more active
online and explored more hyperlinks than women
(Teo & Lim, 1997; Large, Beheshti, & Rahman,
2002). Male online users also take control of the
bandwidth by sending longer messages (Herring,
1992) and tend to use the computer more hours
per week than women (Sexton et al., 2002). Thus,
it is likely that women will be engaged in passive
controls to deal with the unwanted presence of others. Conversely, men, with their strong computer
interests and skills, are more likely to take active
control over any unwanted presence:


H4: Female users are likely to be engaged in


passive control over unwanted presence of
others.
H5: Male users are more likely to actively control
unwanted presence of others.

Study Procedures
The authors developed a survey (Table 4) based on the existing privacy literature. To the best of our knowledge, there is no existing study that empirically covers online privacy issues focusing on users' preferences for controlling the unwanted presence of others (either actively or passively) and concerns about privacy disclosure, revealing personal data, and online privacy practices. To specify the domain of individual privacy concerns and users' ability to control, the authors used the literature review (detailed in a previous section) as the starting point for this instrument. Questions for the three types of privacy concerns (revealing personal data, privacy practice, and disclosure of privacy policy) were modified from Smith, Milberg, and Burke (1996) and others to suit the context of the current study. The user controls over others' presence were derived from Goodwin (1991) and others. The survey instrument was internally distributed to faculty members who specialize in e-commerce and online privacy to ensure its content validity. A pilot study followed, and the instrument was further fine-tuned before implementation.

Survey Structure
The survey begins with a series of questions that assess the respondents' concerns over various privacy issues and their ability to control personal data and the presence of others. To avoid the likelihood of associating each participant's demographic data with real individuals, questions regarding identifiable information were separated from the main part of the survey. This survey is part of a larger study (Chen & Rea, 2004), and Table 4 shows only the survey questions related to the current study. Each privacy-related question was measured with a seven-point Likert scale anchored by strongly agree (1) to strongly disagree (7).

Survey Procedure

The survey was administered to undergraduate


students taking a Web site architecture/design
course. Several prerequisite courses had prepared
participants for basic computer literacy, such as
computer software, hardware, and the Internet,
before they were eligible for this junior-senior level
course. Participants were awarded extra points
in class for completing the survey. To encourage
involvement, the participants were promised
anonymity and that their responses would be used
solely for research purposes.
As other studies note, a study sample of more
advanced users from a younger demographic is
useful because it functions as an indicator of
Internet and e-business usage among more Web-savvy individuals (Nachimias et al., 2001; Kim et
al., 2004; Moores, 2005; Aiken & Boush, 2006).
With the increase of Web usage in almost all
countries, the study group offers a glimpse into
the future of online consumption. However, applying the study to a larger, more general Web-using population would also prove useful for validating our measures of control, and would also indicate shifts in approaches as more users partake in online commerce and general Web usage.
Other studies have demonstrated that even with
a larger demographic, gender tends to play a role
in Internet usage (Teo & Lim, 1997; Roach, 2001;
Sexton et al., 2002; Ono & Zavodny, 2003; Ono &
Zavodny, 2005); in some cases, gender may also
be a factor in e-business participation (Garbarino
& Strahilevitz, 2004; Ha & Stoel, 2004; Chesley,
2006; Fraser & Henry, 2007).
Out of a possible 160 students, 107 elected to participate in the survey, of which 105 valid responses were returned.




Table 1. Factor analysis

Factor 1: Concern about privacy practice (variance explained: 24.07%; Cronbach's alpha: .90)
    (A)  .86
    (B)  .83
    (C)  .83
    (D)  .82
    (E)  .82
    (F)  .67 (secondary loading of .39 on another factor)

Factor 2: Concern about privacy disclosure (variance explained: 17.89%; Cronbach's alpha: .92)
    (G)  .90
    (H)  .88
    (I)  .85
    (J)  .85
    (K)  .82
    (L)  .73

Factor 3: Concern about giving out personal data (variance explained: 10.02%; Cronbach's alpha: .84)
    (M)  .84
    (N)  .74
    (O)  .74
    (P)  .70

Factor 4: Active control (variance explained: 7.85%; Cronbach's alpha: .70)
    (Q)  .86
    (R)  .84
    (S)  .60
    (T)  .58

Factor 5: Passive control (variance explained: 6.90%; Cronbach's alpha: .62)
    (U)  .78
    (V)  .77
    (W)  .56
    (X)  .49

Of the 105 valid responses, eight did not indicate gender in the demographics section and were removed for the purposes of this study. The 97 remaining responses were entered into the database for further analyses and form the basis for this study and the ensuing discussion. The response rate was 66.88%.

Analysis and Results


Of the 97 responses, 71 participants were male (73.2%) and 26 were female (26.8%). Most respondents were between the ages of 16 and 24. Of them, 55.2% were White, 21.9% were Asian, 6.7% were African-American, and 4.8% were Hispanic. Furthermore, 53.3% of the participants were full-time students, 38.1% were employed, and 1% were retired.
Questions regarding privacy concerns and the users' ability to control the environment were factor analyzed with principal component extraction. The exploratory factor analysis with an orthogonal (varimax) rotation yielded factors with multiple loadings. With eigenvalues greater than 1.0 and scree plots as the criteria, we found five factors. Items with factor loadings of less than .30 were excluded from further analyses. The results of the factor analysis are shown in Table 1.
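For readers who wish to replicate this kind of exploratory analysis, the procedure just described (principal component extraction, an orthogonal varimax rotation, an eigenvalue-greater-than-1.0 retention criterion, and suppression of loadings below .30) can be sketched in a few lines of Python. The data below are random placeholders standing in for the 97 x 24 matrix of item responses; this is an illustrative sketch, not the authors' original analysis script.

```python
import numpy as np

def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
    """Orthogonal (varimax) rotation of a component loading matrix."""
    p, k = loadings.shape
    rotation = np.eye(k)
    total = 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        u, s, vt = np.linalg.svd(
            loadings.T @ (rotated ** 3
                          - (gamma / p) * rotated @ np.diag((rotated ** 2).sum(axis=0))))
        rotation = u @ vt
        if s.sum() < total * (1 + tol):
            break
        total = s.sum()
    return loadings @ rotation

# Placeholder Likert responses (97 respondents x 24 items, 7-point scale).
rng = np.random.default_rng(0)
responses = rng.integers(1, 8, size=(97, 24)).astype(float)

corr = np.corrcoef(responses, rowvar=False)             # item correlation matrix
eigvals, eigvecs = np.linalg.eigh(corr)                  # eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

keep = eigvals > 1.0                                     # eigenvalue-greater-than-1 criterion
loadings = eigvecs[:, keep] * np.sqrt(eigvals[keep])     # unrotated component loadings
rotated = varimax(loadings)                              # varimax-rotated loading matrix

rotated[np.abs(rotated) < 0.30] = np.nan                 # suppress loadings below .30
print(np.round(rotated, 2))
```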
Factors one and three were split from a larger section of questions in which respondents were asked how concerned they were about (1) current privacy practice on the Internet and (2) giving out personal information. The results of the factor analysis show two separate factors: concern about privacy practice (variance explained: 24.07%; Cronbach's alpha: .90) and concern about giving out personal data (variance explained: 10.02%; Cronbach's alpha: .84).

The second factor reflects the respondents' feelings about giving their personal data when e-commerce sites are supplemented with privacy components, such as privacy statements, privacy logos, and privacy endorsements from third parties. Therefore, the factor is named concern about privacy disclosure. This factor explains 17.89% of the variance, and its Cronbach's alpha is .92.

The fourth factor comprises items such as deleting browser cookies, abandoning spammed e-mail accounts, and faking personal information for online registration. Although these items are related to dealing with the unwanted presence of others, they exhibit the users' intention to be actively involved in preventing intrusion into personal online privacy. Users combat such intrusion with active actions, such as information falsification and multiple identities. Thus, the factor is named active control. This factor explains 7.85% of the variance, and its Cronbach's alpha is .70.
Four items loaded on the last factor clearly represent users' reactions to the unwanted presence of others. In publicly accessible media, such as e-mail and chat rooms, it is difficult for users to prevent others from contacting them or automatically collecting personal information. One good strategy for blocking such an unwanted presence is simply to ignore it. This factor is named passive control because the items clustered into it relate to user control exercised in a passive way. This factor explains 6.90% of the variance. Cronbach's alpha for this factor is .62. Traditionally, an instrument is considered sufficiently reliable when its Cronbach's alpha is .70 or above. However, Hair, Anderson, Tatham, and Black (1998) indicate that the Cronbach's alpha threshold may be relaxed to .60 in exploratory studies. Because of the exploratory nature of the current study, this factor is considered acceptable for further analysis. Factor scores on these five factors were saved for later analyses.
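For reference, the reliability coefficient used above is Cronbach's alpha, computed as alpha = (k / (k - 1)) * (1 - (sum of the item variances) / (variance of the summed scale)), where k is the number of items in a factor. A minimal computational sketch follows; the data are placeholders rather than the study's actual item scores.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a (respondents x items) array of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    scale_variance = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_variances.sum() / scale_variance)

# Placeholder scores standing in for the four passive-control items (U)-(X).
rng = np.random.default_rng(1)
passive_items = rng.integers(1, 8, size=(97, 4))
print(round(cronbach_alpha(passive_items), 2))
```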
ANOVA was conducted to examine the differences between the two genders in their reactions
to the five privacy factors. The results in Table 3
show that two variables (concern about privacy
disclosure and active control) were significantly
related to the gender variable. Both Table 2
and Table 3 indicate that male users were more
concerned about active control of the unwanted
presence of others, while female users were more
concerned about privacy disclosure. Gender differences were not found in the remaining variables:




Table 2. Means and S.D. for the five privacy factors

                                            N    Mean   Std. Dev.  Std. Error   95% CI lower   95% CI upper    Min.    Max.
(1) Concern about privacy practice
      Female                                26   -.08     1.20        .24          -.57            .40         -4.22    1.30
      Male                                  71    .02      .93        .11          -.20            .24         -3.56    1.11
      Total                                 97   -.05     1.00        .10          -.21            .20         -4.23    1.30
(2) Concern about privacy disclosure
      Female                                26    .38      .75        .15           .08            .69         -1.30    1.36
      Male                                  71   -.15     1.05        .12          -.40            .10         -2.22    1.48
      Total                                 97   -.08     1.00        .10          -.21            .19         -2.22    1.48
(3) Concern about giving out personal data
      Female                                26   -.21     1.11        .22          -.66            .24         -2.64    1.31
      Male                                  71    .07      .96        .11          -.16            .30         -2.60    1.65
      Total                                 97   -.05     1.00        .10          -.21            .20         -2.64    1.65
(4) Active control
      Female                                26   -.33      .96        .19          -.72            .05         -1.82    1.51
      Male                                  71    .14      .99        .12          -.09            .37         -2.15    1.90
      Total                                 97    .01      .99        .10          -.19            .21         -2.15    1.90
(5) Passive control
      Female                                26    .25      .77        .15          -.06            .56         -1.48    1.50
      Male                                  71   -.06     1.03        .12          -.31            .18         -3.91    1.81
      Total                                 97    .02      .98        .10          -.18            .22         -3.91    1.81

Table 3. ANOVA for the five privacy factors

                                            Sum of Squares    df    Mean Square      F      Sig.
(1) Concern about privacy practice
      Between groups                              .22          1        .22          .22    .64
      Within groups                             96.55         95       1.02
      Total                                     96.77         96
(2) Concern about privacy disclosure
      Between groups                             5.45          1       5.45         5.69    .02
      Within groups                             90.91         95        .96
      Total                                     96.36         96
(3) Concern about giving out personal data
      Between groups                             1.47          1       1.47         1.47    .23
      Within groups                             95.28         95       1.00
      Total                                     96.75         96
(4) Active control
      Between groups                             4.28          1       4.28         4.45    .04
      Within groups                             91.30         95        .96
      Total                                     95.58         96
(5) Passive control
      Between groups                             1.86          1       1.86         1.95    .17
      Within groups                             90.59         95        .95
      Total                                     92.45         96


passive control, concern about giving personal


data, and concern about privacy practice.
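Because gender has only two levels, each one-way ANOVA reported in Table 3 is equivalent to an independent-samples t-test on the saved factor scores (F equals t squared). A minimal sketch of one such comparison is shown below, using simulated factor scores with roughly the Table 2 means and standard deviations rather than the study's actual data.

```python
import numpy as np
from scipy import stats

# Simulated factor scores standing in for concern about privacy disclosure
# (26 female and 71 male respondents, as in the study sample).
rng = np.random.default_rng(2)
female_scores = rng.normal(loc=0.38, scale=0.75, size=26)
male_scores = rng.normal(loc=-0.15, scale=1.05, size=71)

f_statistic, p_value = stats.f_oneway(female_scores, male_scores)
print(f"F = {f_statistic:.2f}, p = {p_value:.3f}")   # compare p against the .05 level
```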

Discussion and Recommendations


The first hypothesis predicts that females are
more concerned about privacy practice online
(Factor 1). With a p-value greater than .05, this
is not supported in the current study (p = .64).
The second hypothesis conjectures that females
are less likely to share their personal data with
online vendors (Factor 3). The result of ANOVA
suggests this hypothesis should also be rejected (p
= .23). There was no statistical difference between
the two gender groups in their concerns about
giving out personal data. However, the means of
variables clustered into the same factor range from
5.59 to 6.19 for females, and from 5.73 to 6.41 for
males, indicating that both gender groups were
very concerned about giving out their personal
data on the Internet. E-businesses should take this
concern into account and work toward alleviating
this condition through privacy assurances.
Concerns about disclosure of privacy procedures (Factor 2) were conjectured in the third
hypothesis. The results from Table 3 support this
hypothesis (p = .02). Females appear to be more
concerned about whether e-businesses disclose
their privacy procedures. As the results suggest,
females pay more attention to online privacy than
males. The watcher-and-the-watched study by Friedman et al. (2006) found similar gender-related privacy concerns. E-businesses that disclose their privacy
practices in various modes, such as showing a
privacy statement, displaying a third-party privacy
logo, and creating alliances with other well-known
sites (Hu et al., 2003; Noteberg et al., 2003; Kim et
al., 2004; Aiken & Boush, 2006), are very likely
to draw attention from female users. It is likely
that privacy issues and practices are beginning to
emerge as a strong influencer of e-commerce success. Privacy is an important consideration for those who are sensitive to their physical or virtual environments. Females are known to possess a

high environmental sensitivity (Meyers-Levy &


Maheswaran, 1991) and will be the first to spot
the importance of privacy practices within their
virtual environments.
A lack of trust in e-privacy practices has been cited as one cause of low consumer involvement in business-to-consumer electronic commerce (Hoffman et al., 1999; Phelps et al., 2000; Eastlick et al., 2006; Roman, 2007). Our present study supports this assertion because female users are the most likely to be influenced by various privacy procedure disclosures. Therefore, to foster trust and enhance consumer perception of privacy protection, e-businesses are advised to make their privacy procedures visible and understandable.
Hypotheses four and five predict that females
are more likely to adopt passive control mechanisms (Factor 5) to block the unwanted presence
of others, while males are more likely to take a
proactive approach (Factor 4) to prevent such a
presence. Hypothesis four was not supported (p =
.17), but hypothesis five was (p = .04). As evident
in existing literature, males are likely to exercise
control over messages and the communication medium. Unsolicited marketing e-mail and Web user
tracking components, such as cookies and Web
bugs, can be ineffective because male users will
control and filter out these unwanted messages.
As a result, business intelligence resulting from
mining the data collected from such user-tracking
software can be biased and distort online marketing decisions. E-businesses should be cautioned
to carefully interpret the mined data, which may
be subverted by male users' online behavior.

Limitations of the Study


Our study empirically assessed an Internet user's
ability to control his or her private information
and its relationship with the five types of privacy
assurance techniques. However, due to its early
assessment of privacy controls and its empirical
nature, this study is subject to limitations.




Table 4. Survey instrument (each item was rated on a seven-point scale from strongly agree (1) to strongly disagree (7), with intermediate points somewhat agree, agree, neutral, disagree, and somewhat disagree)

Factor 1: Concern about privacy practice
(A) Computer databases that contain personal information should be protected from unauthorized access.
(B) E-businesses should never share personal information with other companies without proper authorization.
(C) E-businesses should never sell the personal information in their computer database to other companies.
(D) E-businesses should take more steps to make sure that unauthorized people cannot access personal information in their computers.
(E) E-businesses should devote more time and effort to preventing unauthorized access to personal information.
(F) When people give personal information to an e-business for some reason, the e-business should never use the information for any other reason.

Factor 2: Concern about privacy disclosure
(G) I feel more comfortable submitting personal information to sites that have established themselves as good e-commerce sites.
(H) I feel more comfortable submitting personal information to sites that have a privacy statement.
(I) I feel more comfortable submitting personal information to sites that have a privacy logo endorsed by another site.
(J) I feel more comfortable submitting personal information to sites that have a brick-and-mortar counterpart that I can shop in.
(K) I feel more comfortable submitting personal information to sites that have established themselves with other well-known companies.
(L) I feel more comfortable submitting personal information to sites that have a statement that guarantees protection of personal information.

Factor 3: Concern about giving out personal data
(M) It usually bothers me when e-businesses ask me for personal information.
(N) When e-businesses ask me for personal information, I sometimes think twice before providing it.
(O) It bothers me to give personal information to so many e-businesses.
(P) I am concerned that e-businesses are collecting too much personal information about me.

Factor 4: Active control
(Q) When downloading software from a Web site that requires registration, I fake my personal information to obtain the software.
(R) When visiting a Web page that requires registration, I fake my personal information to obtain access.
(S) I use several e-mail accounts for privacy reasons.
(T) I know how to delete cookies from my computer.

Factor 5: Passive control
(U) I do not answer unsolicited telemarketing calls.
(V) I do not respond to unsolicited e-mail.
(W) I block Web browsers from receiving cookies.
(X) I ignore chat requests from people that I don't know.


As noted previously, the current study sample


may not be entirely representative since students
were recruited to participate in the survey. However, the responses of these Web-savvy users may
indicate that much needs to be done to facilitate
effective e-business transactions. The sampling
issue may also influence the generalizability of
the findings. However, the differences found in
the relationship between privacy controls and user
concerns from this somewhat homogeneous group
of respondents may suggest that more dramatic differences could be expected from a broader sample
involving randomly selected participants.
Finally, this study does not address psychological and attitudinal factors, which may be avenues
for future exploration within the online privacy
context. Even with these constraints, the current study nonetheless provides preliminary, yet
useful, insights to the body of privacy research. Moreover, it can be situated alongside the work of researchers from diverse disciplines who are currently examining various factors in other exploratory
studies (Garbarino & Strahilevitz, 2004; Royal,
2005; Chesley, 2006; Kang & Yang, 2006; Fraser
& Henry, 2007; Hupfer & Detlor, 2007; Yao et
al., 2007).

Future Trends
The ongoing tension between the e-business need
for personal information and user willingness
to provide personal information in exchange for
goods and services will increasingly demand our
attention. As more information becomes available
online for individual use, more information is
collected by those offering the services. Consider
Internet behemoths, such as Google, that offer a
powerful search engine, e-mail, map searches
(including actual street views), word processing
software, and various other tools (Google, 2007a),
all online for free access and use by anyone.
However, these features may come at a price of
which most users are unaware: privacy. Whether


a user remains logged into her Google account or


keeps her Google Toolbar activated in her Web
browser (Google, 2007b), information is collected
at granular levels and saved for months or years on
Google's systems (O'Brien & Crampton, 2007).
Subverting this process requires extensive technical knowledge and will limit many of the proffered
Google features. Here we have the crux of the
issue: features versus privacy. To compound the
issue, experts have labeled Google as the worst
offender in terms of its privacy practices (CNN,
2007). Coupled with a pending acquisition of
DoubleClick by Google, we could see serious
privacy infringements via the amount and type
of personal information collected online (EPIC,
2007).
Google is not the only e-business in this
scenario of services versus privacy, but it is one
of the most prevalent. As more commodities are
offered online, more will be asked of users. Both
e-businesses and consumers must work together
to find the middle ground between how much
information can be collected, retained, and used,
and what users are willing to provide in exchange
for the goods and services.
One avenue towards this middle ground may
already be available to e-businesses: a rebirth
and reintegration of the W3C's P3P specification.
Although P3P is currently suspended, it would
take little effort to incorporate this specification
into current Web browsers via plug-in technology.
Additional initiatives based on P3P are already in
the works, such as the Policy Aware Web (2006)
and Prime (2007).
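As background, a P3P-enabled site publishes an XML policy file and can also send a compact policy in an HTTP response header, which P3P-aware browsers evaluate when deciding how to treat the site's cookies. The sketch below is purely illustrative: the header names are standard HTTP, but the compact-policy token string shown is an arbitrary example, not a complete or validated policy.

```python
# Illustrative only: an HTTP response carrying a P3P compact policy header.
# The CP token string is a hypothetical example; a real deployment would
# generate it from the site's full P3P policy file.
response_headers = {
    "Content-Type": "text/html; charset=utf-8",
    "P3P": 'CP="NOI DSP COR CUR ADM DEV OUR"',
    "Set-Cookie": "session=abc123; Path=/; HttpOnly",
}

for name, value in response_headers.items():
    print(f"{name}: {value}")
```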
No matter what the solution, without a dialogue about these issues, users will continue to
find both active and passive means to subvert
data collection techniques. As a result, e-businesses will continue to work to find new ways to
collect data, sometimes without users' consent.
We suggest the resources and time necessary for
an acceptable compromise would be better spent
looking for solutions together rather than fighting
for control of information.


Conclusion
Our study suggests that both female and male
users implement controls when utilizing online
services in the current e-business environment.
Males use multiple identities and techniques to
actively thwart data collection, whereas females
passively ignore information requests or altogether forgo participation. Whatever the case, if
e-businesses want to collect viable data in order
to improve online offerings and remain competitive, they must (1) implement an accessible and
easy-to-read privacy statement and (2) obtain
endorsement from well-known privacy groups
such as the BBBOnLine (BBBOnLine, 2007) and
TRUSTe (TRUSTe, 2007), as well as prominently
display the resulting certification logo. These two
items are the foundation upon which an e-business
can begin to form the initial trusting relationship
between itself and users.
Without trust between e-business and consumer, there can be no productive relationship
(Duffy, 2005; Flavián & Guinalíu, 2006; Roman,
2007). Consumers will share the required information if they understand and agree with the privacy
policies, as well as trust that an e-business will
only use their personal data to better the available personalized offerings. In the information
age, most consumers are bombarded with useless
information on a daily basis; customized offerings are a most welcome change. However, if the
trust is breached, an e-business will experience an
uphill battle to regain relationships with existing
customers and acquire new consumers.

Future Research Directions


Our study linking gender to attitudes toward online privacy is promising. E-businesses should take
note that female users tend to build relationships
with organizations that disclose their privacy policies (hypothesis three). E-businesses, particularly
those seeking to maximize their female customer

base, should consider meeting the requirements


of privacy and business organizations, such as
TRUSTe (TRUSTe, 2007) and the BBBonline
(BBBOnLine, 2007), thereby enabling the display
of the third-party trustmark logo on their Web
sites. Future research should look to help e-businesses determine the correct course of action to
maximize initial relationship formations.
Male users taking an active approach to privacy
protection (hypothesis five) suggests another track
of research. E-businesses that rely solely on Web
bugs, cookies, banner ads, and click streams are
losing valuable data because of the active control
of information male users employ during their
Web surfing forays. As a result, mined data might
be skewed in directions unfavorable to effective
marketing applications.
In recent years, Internet computing has experienced a major interaction shift with the influx
of social computing applications, such as social
networking sites like Facebook (Facebook, 2007)
and MySpace (MySpace, 2007), and the influx of
Web 2.0 services, such as Google Docs (Google,
2007c). Current calls for cross-disciplinary research (Parameswaran & Whinston, 2007) must
be heeded as we look to the changing social dynamics within online social networks (Jenkins &
Boyd, 2006) and how they affect gender-related
perceptions of privacy. For example, recent self-documented Facebook incidents (Chang, 2006; Leitzsey, 2006; Stapleton-Paff, 2007; Stokes, 2007)
of collegiate underage drinking and criminal
mischief are no longer uncommon. Both male
and female students share more of what used
to be considered personal within these public
spaces. Researchers must examine if this shift
translates into changing attitudes (gender-related or otherwise) toward all forms of control
of personal information, especially within the
e-business realm.
Our current study is designed to focus on
gender differences in several privacy issues. It
is likely that further insights can be discovered
when other user characteristics, such as computer


and Internet experience, culture, learning styles


and other types of online preferences, and demographic data, such as age, job type, and education,
are included. When these other factors are taken
into consideration, gender may or may not be the
sole factor that explains user privacy concerns.
Ultimately, our study suggests there are rich veins
of research yet to be mined. Online privacy issues
are in their infancy. As technology advances and
more people use the Internet, demographical studies concerning users view of online privacy and
how e-business can remain effective, yet maintain
trusting relationships, are essential.

References


Ackerman, M. S., & Cranor, L. F. (1999). Privacy critics: UI components to safeguard users' privacy. In Proceedings of the ACM Conference on
critics: UI components to safeguard users privacy. In Proceedings of the ACM Conference on
Human Factors in Computing Systems (CHI99)
(pp. 258-259). Pittsburgh, PA.
Aiken, K., & Boush, D. (2006). Trustmarks, objective-source ratings, and implied investments
in advertising: Investigating online trust and
the context-specific nature of internet signals.
Academy of Marketing Science Journal, 34(3),
308-323.
Allen, A. (2000). Gender and privacy in cyberspace. Stanford Law Review, 52(5), 1175-1200.
Apple. (2007). iTunes [software]. Retrieved
December 9, 2007, from http://www.apple.com/
itunes/
Arbaugh, J. B. (2000). An exploratory study of
effects of gender on student learning and class
participation in an internet-based MBA course.
Management Learning, 31(4), 503-519.
Ashrafi, N., & Kuilboer, J. (2005). Online privacy
policies: An empirical perspective on self-regulatory practices. Journal of Electronic Commerce
in Organizations, 3(4), 61-74.

BBBOnLine. (2007). Retrieved December 9, 2007, from http://www.bbbonline.org/
Berendt, B., Günther, O., & Spiekermann, S.


(2005). Privacy in e-commerce: Stated preferences vs. actual behavior. Communications of
the ACM, 48(4), 101-106.
Borland, J. (2006). Apple's iTunes raises privacy
concerns. CNet News. Retrieved July 7, 2007, from
http://news.com.com/Apples+iTunes+raises+priv
acy+concerns/2100-1029_3-6026542.html
Brunner, C. (1991). Gender and distance learning. Annals of the American Academy of Political and Social Science, 133-145.

Campbell, J., & Carlson, M. (2002). Panopticon.com: Online surveillance and the commodification of privacy. Journal of Broadcasting & Electronic Media, 46(4), 586-606.
Chang, J. (2006). Is Facebook private? Northwestern Chronicle. Retrieved October 2, 2007,


from http://www.chron.org/tools/viewart.
php?artid=1346
Chen, K., & Rea, A. (2004). Protecting personal
information online: A survey of user privacy concerns and control techniques. Journal of Computer
Information Systems, 44(4), 85-92.
Chen, Y., & Barnes, S. (2007). Initial trust and
online buyer behaviour. Industrial Management
& Data Systems, 107(1), 21-36.
Chesley, N. (2006). Families in a high-tech age:
Technology usage patterns, work and family
correlates, and gender. Journal of Family Issues,
27(5), 587-608.
Chua, S. L., Chen, D.T., & Wong, A. F. L. (1999).
Computer anxiety and its correlates: A metaanalysis. Computers in Human Behaviors, 15,
609-623.
Clarke, R. (1999). Internet privacy concerns confirm the case for intervention. Communications
of the ACM, 42(2), 60-67.


CNN. (2007). Google privacy Worst on the


Web. CNN.com. Retrieved July 10, 2007, from
http://www.cnn.com/2007/TECH/internet/06/11/
google.privacy.ap/
D'Astous, A. (2000). Irritating aspects of the shopping environment. Journal of Business Research,
49, 149-156.
Duffy, D. (2005). The evolution of customer loyalty strategy. The Journal of Consumer Marketing,
22(4/5), 284-286.
Eastlick, M., Lotz, S., & Warrington, P. (2006).
Understanding online B-to-C relationships: An
integrated model of privacy concerns, trust, and
commitment. Journal of Business Research,
59(8), 877-886.
Electronic Privacy Information Center (EPIC).
(1997). Surfer beware: Personal privacy and the
internet. Washington, D.C.: Electronic Privacy
Information Center. Retrieved July 7, 2007, from
http://www.epic.org/reports/surfer-beware.html
Electronic Privacy Information Center (EPIC).
(2007). Proposed Google/DoubleClick deal.
Washington, D.C.: Electronic Privacy Information
Center. Retrieved July 7, 2007, from http://www.
epic.org/privacy/ftc/google/
Facebook. (2007). Retrieved December 9, 2007,
from http://www.facebook.com/
Festa, P. (2002). Windows media aware of DVDs
watched. CNet News. Retrieved May 28, 2007,
from http://news.com.com/2100-1023-841766.
html
Fischer, K. (2007). Apple hides account info in
DRM-free music, too. Ars Technica. Retrieved
July 8, 2007, from http://arstechnica.com/news.
ars/post/20070530-apple-hides-account-info-indrm-free-music-too.html
Flavián, C., & Guinalíu, M. (2006). Consumer
trust, perceived security and privacy policy: Three

basic elements of loyalty to a web site. Industrial


Management & Data Systems, 106(5), 601-620.
Fram, E. H., & Grady, D. B. (1997). Internet
shoppers: Is there a surfer gender gap? Direct
Marketing, January, 46-50.
Fraser, S., & Henry, L. (2007). An exploratory
study of residential internet shopping in Barbados. Journal of Eastern Caribbean Studies,
32(1), 1-20, 93.
Friedman, B., Kahn, P., Hagman, J., Severson, R.,
& Gill, B. (2006). The watcher and the watched:
Social judgments about privacy in a public place.
Human-Computer Interaction, 21(2), 235-272.
Garbarino, E., & Strahilevitz, M. (2004). Gender
differences in the perceived risk of buying online
and the effects of receiving a site recommendation.
Journal of Business Research, 57(7), 768-775.
Gefen, D., & Straub, D. W. (1997). Gender differences in the perception and use of e-mail: An
extension to the technology acceptance model.
MIS Quarterly, 21(4), 389-400.
Goodwin, C. (1991). Privacy: Recognition of
a consumer right. Journal of Public Policy &
Marketing, 10(1), 149-166.
Google. (2007a). Retrieved December 9, 2007,
from http://www.google.com/
Google. (2007b). Retrieved December 9, 2007,
from http://toolbar.google.com/
Google. (2007c). Retrieved December 9, 2007,
from http://documents.google.com/
Gumpert, G., & Drucker, S. (1998). The demise
of privacy in a private world: From front porches
to chat rooms. Communication Theory, 8(4),
408-425.
Ha, Y., & Stoel, L. (2004). Internet apparel
shopping behaviors: The influence of general innovativeness. International Journal of Retail &
Distribution Management, 32(8/9), 377-385.


Hair, J. F., Jr., Anderson, R. E., Tatham, R. L., &


Black, W. C. (1998). Multivariate data analysis
(5th ed.). Upper Saddle River, NJ: Prentice Hall.
Han, P., & Maclaurin, A. (2002). Do consumers really care about online privacy? Marketing
Management, 11(1), 35-38.
Hemphill, T. (2002). Electronic commerce and
consumer privacy: Establishing online trust in
the U.S. digital economy. Business and Society
Review, 107(2), 221-239.
Herring, S. C. (1992). Gender and participation
in computer-mediated linguistic discourse. Washington, DC: ERIC Clearinghouse on Languages
and Linguistics, Document no. ED 345552.
Hoffman, D. L., Novak, T. P., & Peralta, M. (1999).
Building consumer trust online. Communications
of the ACM, 42(4), 80-85.
Hu, X., Lin, Z., & Zhang, H. (2003). Trust promoting seals in electronic markets: An exploratory study of their effectiveness for online sales
promotion. Journal of Promotion Management,
9(1/2), 163-180.
Hui, K. L., Tan, B., & Goh, C. Y. (2006). Online
information disclosure: Motivators and measurements. ACM Transactions on Internet Technology,
6(4), 415-441.
Hupfer, M., & Detlor, B. (2007). Beyond gender
differences: Self-concept orientation and relationship-building applications on the internet. Journal
of Business Research, 60(6), 613-619.
Janda, S., Trocchia, P., & Gwinner, K. (2002).
Consumer perceptions of internet retail service
quality. International Journal of Service Industry
Management, 13(5), 412-431.
Jenkins, H., & Boyd, D. (2006). MySpace and
deleting online predators act (DOPA). MIT Tech
Talk. Retrieved October 12, 2007, from http://web.
mit.edu/cms/People/henry3/MySpace.pdf


Kang, H., & Yang, H. (2006). The visual characteristics of avatars in computer-mediated communication: Comparison of internet relay chat
and instant messenger as of 2003. International
Journal of Human-Computer Studies, 64(12),
1173-1183.
Kilbourne, W., & Weeks, S. (1997). A socio-economic perspective on gender bias in technology.
Journal of Socio-Economics, 26(1), 243-260.
Kim, D. J., Steinfield, C., & Lai, Y. (2004). Revisiting the role of web assurance seals in consumer
trust. In M. Janssen, H. G. Sol, & R. W. Wagenaar (Eds.), Proceedings of the 6th International
Conference on Electronic Commerce (ICEC 04,
60) (pp. 280-287). Delft, The Netherlands: ACM
Press.
King, J., Bond, T., & Blandford, S. (2002). An
investigation of computer anxiety by gender
and grade. Computers in Human Behavior, 18,
69-84.
Krill, P. (2002). DoubleClick discontinues web
tracking service. ComputerWorld. Retrieved February 24, 2002, from http://www.computerworld.
com/storyba/0,4125,NAV47_STO67262,00.html
Large, A., Beheshti, J., & Rahman, T. (2002).
Gender differences in collaborative web searching
behavior: An elementary school study. Information Processing & Management, 38, 427-443.
Lauer, T., & Deng, X. (2007). Building online trust
through privacy practices. International Journal
of Information Security, 6(5), 323-331.
Leitzsey, M. (2006). Facebook can cause problems
for students. Online PacerTimes. Retrieved October 2, 2007, from http://media.www.pacertimes.
com/media/storage/paper795/news/2006/01/31/
News/Facebook.Can.Cause.Problems.For.Students-1545539.shtml
Levy, S., & Gutwin, C. (2005). Security through
the eyes of users. In Proceedings of the 14th International Conference on the World Wide Web (pp. 480-488). Chiba, Japan.
McElhearn, K. (2006). iSpy: More on the iTunes
MiniStore and privacy. Kirkville. Retrieved July
7, 2007, from http://www.mcelhearn.com/article.
php?story=20060112175208864

Nachimias, R., Mioduser, D., & Shelma, A. (2001).


Information and communication technologies
usage by students in an Israeli high school: Equity, gender, and inside/outside school learning
issues. Education and Information Technologies,
6(1), 43-53.

McKnight, D. H., Choudhury, V., & Kacmar, C.


(2000). Trust in e-commerce vendors: A two-stage
model. In Proceedings of the Association for Information Systems (pp. 532-536). Atlanta, GA.

Nam, C., Song, C., Lee, E., & Park, C. (2005).


Consumers' privacy concerns and willingness
to provide marketing-related personal information online. Advances in Consumer Research,
33, 212-217.

Metzger, M. (2006). Effects of site, vendor, and


consumer characteristics on web site trust and
disclosure. Communication Research, 33(3),
155-179.

Noteberg, A., Christiaanse, E., & Wallage, P.


(2003). Consumer trust in electronic channels.
E-Service Journal, 2(2), 46-67.

Meyers-Levy, J., & Maheswaran, D. (1991). Exploring differences in males' and females' processing strategies. Journal of Consumer Research,
18(June), 63-70.
Microsoft. (2003). Windows media player 9 series
privacy settings. Retrieved July 7, 2007, from
http://www.microsoft.com/windows/windowsmedia/player/9series/privacy.aspx
Miyazaki, A. D., & Fernandez, A. (2000). Internet
privacy and security: An examination of online
retailer disclosures. Journal of Public Policy &
Marketing, 19(Spring), 54-61.

Nowak, G. J., & Phelps, J. (1992). Understanding


privacy concerns: An assessment of consumers'
information related knowledge and beliefs. Journal of Direct Marketing, 6(Autumn), 28-39.
O'Brien, K., & Crampton, T. (2007). EU asks Google to explain data retention policies. International Herald Tribune. Retrieved July 10, 2007,
from http://www.iht.com/articles/2007/05/25/
business/google.php
Oakes, C. (1999). Mouse pointer records clicks.
Wired News. Retrieved June 28, 2006, from http://
www.wired.com/news/print/0,1294,32788,00.
html

Miyazaki, A. D., & Fernandez, A. (2001). Consumer perceptions of privacy and security risks
for online shopping. Journal of Consumer Affairs,
35(1), 27-44.

Ono, H., & Zavodny, M. (2003). Gender and


the internet. Social Science Quarterly, 84(1),
111-121.

Moores, T. (2005). Do consumers understand the


role of privacy seals in e-commerce? Communications of the ACM, 48(3), 86-91.

Ono, H., & Zavodny, M. (2005). Gender differences in information technology usage: A U.S.Japan comparison. Sociological Perspectives,
48(1), 105-133.

Mukherjee, A., & Nath, P. (2007). Role of electronic trust in online retailing: A re-examination of
the commitment-trust theory. European Journal
of Marketing, 41(9/10), 1173-1202.

Pan, Y., & Zinkhan, G. (2006). Exploring the


impact of online privacy disclosures on consumer
trust. Journal of Retailing, 82(4), 331-338.

MySpace. (2007). Retrieved December 9, 2007,


from http://www.myspace.com/

Papazafeiropoulou, A., & Pouloudi, A. (2001).


Social issues in electronic commerce: Implications for policy makers. Information Resources Management Journal, 14(4), 24-32.
Parameswaran, M., & Whinston, A. (2007).
Research issues in social computing. Journal of
the Association for Information Systems, 8(6),
336-350.
Patton, M., & Josang, A. (2004). Technologies for
trust in electronic commerce. Electronic Commerce Research, 4(1-2), 9-21.
Phelps, J., Nowak, G., & Ferrell, E. (2000). Privacy
concerns and consumer willingness to provide
personal information. Journal of Public Policy
& Marketing, 19(1), 27-41.
Policy Aware Web. (2006). Retrieved July 10,
2007, from http://www.policyawareweb.org/
Privacy and Identity Management for Europe.
(Prime). (2007). Retrieved July 10, 2007, from
https://www.prime-project.eu/

Roman, S. (2007). The ethics of online retailing: A scale development and validation from
the consumers perspective. Journal of Business
Ethics, 72(2), 131-148.
Roussous, G., & Moussouri, T. (2004). Consumer perceptions of privacy, security and trust
in ubiquitous commerce. Pers Ubiquit Comput,
8, 416-429.
Royal, C. (2005). A meta-analysis of journal articles intersecting issues of internet and gender.
Journal of Technical Writing and Communication,
35(4), 403-429.
Savicki, V., Kelley, M., & Lingenfelter, D. (1996).
Gender and small task group activity using computer-mediated communication. Computers in
Human Behavior, 12, 209-224.

Qutami, Y., & Abu-Jaber, M. (1997). Students'


self-efficacy in computer skills as a function of
gender and cognitive learning style at Sultan
Qaboos University. International Journal of
Instructional Media, 24(1), 63-74.

Schoder, D., & Yin, P. (2000). Building firm trust


online. Communications of the ACM, 43(12),
73-79.

Radcliff, D. (2001). Giving users back their


privacy. ComputerWorld. Retrieved February
28, 2002, from http://www.computerworld.com/
storyba/0,4125,NAV47_STO61981,00.html

Schuman, E. (2006). Gartner: $2 billion in ecommerce sales lost because of security fears.
eWeek.com, November 27. Retrieved October
15, 2007, from http://www.eweek.com/article2/0,1895,2063979,00.asp

Riegelsberger, J., Sasse, M., & McCarthy, J. (2003).


Shiny happy people building trust?: Photos on
e-commerce websites and consumer trust. In
Proceedings of the SIGCHI Conference on Human
Factors in Computing Systems (pp. 121-128). Ft.
Lauderdale, FL.
Roach, R. (2001). Internet usage reflects gender
breakdown. Black Issues in Higher Education,
18(11), 119.
Rohm, A. J., & Milne, G. R. (1998). Emerging marketing and policy issues in electronic commerce: Attitudes and beliefs of internet users. In A. Andreasen, A. Simonson, & N. C. Smith (Eds.), Proceedings of Marketing and Public Policy (Vol. 8, pp. 73-79). Chicago: American Marketing Association.

Sexton, R., Johnson, R., & Hignite, M. (2002).


Predicting internet/e-commerce use. Internet
Research, 12(5), 402-410.
Shalhoub, Z. (2006). Trust, privacy, and security in electronic business: The case of the GCC
countries. Information Management & Computer
Security, 14(3), 270-283.
Smith, H. J., Milberg, S. J., & Burke, S. J. (1996).
Information privacy: Measuring individuals'
concerns about organizational practices. MIS
Quarterly, June, 167-196.


Smith, R. (2002a). Serious privacy problems in


windows media player for Windows XP. Retrieved
July 7, 2007, from http://www.computerbytesman.
com/privacy/wmp8dvd.htm
Smith, R. (2002b). Microsoft response to the windows media player 8 privacy advisory. Retrieved
July 7, 2007, from http://www.computerbytesman.
com/privacy/wmp8response.htm
So, M., & Sculli, D. (2002). The role of trust,
quality, value and risk in conducting e-business.
Industrial Management & Data Systems, 102(8/9),
503-512.
Stapleton-Paff, K. (2007). Facebook poses
privacy issues for students. The Daily of the
University of Washington. Retrieved October
2, 2007, from http://www.thedaily.washington.
edu/article/2007/4/25/facebookPosesPrivacyIssuesForStudents
Stokes, M. (2007). Police charge three female
students for sign stealing. Western Herald. Retrieved October 22, 2007 from http://media.www.
westernherald.com/media/storage/paper881/
news/2007/10/22/News/Police.Charge.Three.Female.Students.For.Sign.Stealing-3046592.shtml
Tavani, H. (1999). Privacy online. Computers and
Society, 29(4), 11-19.
Tavani, H., & Moor, J. (2001). Privacy protection,
control of information, and privacy-enhancing
technologies. Computers and Society, 31(1), 6-11.

TRUSTe. (2007). Retrieved December 9, 2007,


from http://www.truste.org/
Van Dyke, T., Midha, V., & Nemati, H. (2007).
The effect of consumer privacy empowerment
on trust and privacy concerns in e-commerce.
Electronic Markets, 17(1), 68-81.
Wang, H., Lee, M., & Wang, C. (1998). Consumer
privacy concerns about internet marketing. Communications of the ACM, 41(3), 63-70.
Wang, P., & Petrison, L. A. (1993). Direct marketing activities and personal privacy. Journal of
Direct Marketing, 7(Winter), 7-19.
WebTrends. (2007). Retrieved December 9, 2007,
from http://www.webtrends.com/
Weible, R. J. (1993). Privacy and data: An empirical study of the influence and types and data
and situational context upon privacy perceptions.
(Doctoral Dissertation, Department of Business
Administration, Mississippi State University).
Dissertation Abstracts International, 54(06).
World Wide Web Consortium (W3C). (2006).
The platform for privacy preferences 1.1 (P3P1.1)
specification. World Wide Web Consortium.
Retrieved July 7, 2007, from http://www.w3.org/
TR/P3P11/
World Wide Web Consortium (W3C). (2007). The
platform for privacy preferences (P3P) project.
World Wide Web Consortium. Retrieved July 5,
2007, from http://www.w3.org/P3P/

Teo, T., & Lim, V. (1997). Usage patterns and


perceptions of the internet: The gender gap. Equal
Opportunities International, 16(6/7), 1-8.

Yao, M., Rice, R., & Wallis, K. (2007). Predicting user concerns about online privacy. Journal
of the American Society for Information Science
and Technology, 58(5), 710-722.

Teo, T. S. H., Lim, V. K. G., & Lai, R. Y. C.


(1999). Intrinsic and extrinsic motivation in internet usage. OMEGA: International Journal of
Management Science, 27, 25-37.

Yousafzai, S., Pallister, J., & Foxall, G. (2003). A


proposed model of e-trust for electronic banking.
Technovation, 23(11), 847-860.


Additional Reading
Antón, A., Bertino, E., Li, N., & Yu, T. (2007). A
roadmap for online privacy policy management.
Communications of the ACM, 50(7), 109-116.
Bonini, S., McKillop, K., & Mendonca, L. (2007).
The trust gap between consumers and corporations. The McKinsey Quarterly, 2, 7-10.
Boyens, C., Günther, O., & Teltzrow, M. (2002).
Privacy conflicts in CRM services for online
shops: A case study. In Proceedings of the IEEE
International Conference on Privacy, Security,
and Data Mining, Maebashi City, Japan (Vol.
14, pp. 27-35).
Curtin, M. (2002). Developing trust: Online privacy and security. New York: Springer-Verlag.
Cvrcek, D., Kumpost, M., Matyas, V., & Danezis,
G. (2006). A study on the value of location privacy. In Proceedings of the 5th ACM Workshop
on Privacy in Electronic Society (pp. 109-118).
Alexandria, VA.
Drennan, J., Sullivan, G., & Previte, J. (2006).
Privacy, risk perception, and expert online behavior: An exploratory study of household end
users. Journal of Organizational and End User
Computing, 18(1), 1-22.
Earp, J., & Baumer, D. (2003). Innovative web
use to learn about consumer behavior and online
privacy. Communications of the ACM, 46(4),
81-83.
Electronic Frontier Foundation (EFF). (2007).
Privacy issues. Retrieved July 7, 2007, from
http://www.eff.org/Privacy/
Electronic Privacy Information Center (EPIC).
(2007). EPIC online guide to practical privacy
tools. Retrieved July 7, 2007, from http://www.
epic.org/privacy/tools.html
Frackman, A., Ray, C, & Martin, R. (2002). Internet and online privacy: A legal and business
guide. New York: ALM Publishing.

Freeman, L., & Peace, A. (Eds.). (2005). Information ethics: Privacy and intellectual property.
Hershey, PA: Information Science Publishing.
Golle, P. (2006). Revisiting the uniqueness of
simple demographics in the U.S. population. In
Workshop on Privacy in the Electronic Society:
Proceedings of the 5th ACM Workshop on Privacy
in the Electronic Society (pp. 77-80). Alexandria,
VA.
Gross, R., Acquisti, A., & Heinz, H. (2005).
Information revelation and privacy in online
social networks. In Workshop on Privacy in The
Electronic Society: Proceedings of the 2005 ACM
Workshop on Privacy in the Electronic Society
(pp. 71-80). Alexandria, VA.
Kauppinen, K., Kivimäki, A., Era, T., & Robinson, M. (1998). Producing identity in collaborative virtual environments. In Proceedings of the
ACM Symposium on Virtual Reality Software and
Technology (pp. 35-42). Taipei, Taiwan.
Khalil, A., & Connelly, K. (2006). Context-aware
telephony: Privacy preferences and sharing patterns. In Proceedings of the 2006 20th Anniversary
Conference on Computer Supported Cooperative
Work (pp. 469-478). Banff, Alberta, Canada.
Landesberg, M., Levin, T., Curtin, G., & Lev,
O. (1998). Privacy online: A report to congress.
Federal Trade Commission. Retrieved July 7,
2007, from http://www.ftc.gov/reports/privacy3/
toc.shtm
Lumsden, J., & MacKay, L. (2006). How does
personality affect trust in B2C e-commerce?
In ACM International Conference Proceeding
Series, Vol. 156: Proceedings of the 8th International Conference on Electronic Commerce: The
New E-Commerce: Innovations for Conquering
Current Barriers, Obstacles and Limitations to
Conducting Successful Business on the Internet.
(pp. 471-481). Fredericton, New Brunswick,
Canada.


McCloskey, D. W. (2006). The importance of ease


of use, usefulness, and trust to online consumers:
An examination of the technology acceptance
model with older customers. Journal of Organizational and End User Computing, 18(3), 47-65.
Novak, J., Raghavan, P., & Tomkins, A. (2004).
Anti-aliasing on the web. In Proceedings of the
13th International Conference on World Wide
Web (pp. 30-39). New York.
Peters, T. (1999). Computerized monitoring and
online privacy. Jefferson, NC: McFarland &
Co.

Reagle, J., & Cranor, L. F. (1999). The platform


for privacy preferences. Communications of the
ACM, 42(2), 48-55.
Tan, F. B., & Sutherland, P. (2004). Online consumer trust: A multi-dimensional model. Journal
of Electronic Commerce in Organizations, 2(3),
40-58.
Walters, G. J. (2001). Privacy and security: An
ethical analysis. SIGCAS Computer Soc., 31(2),
8-23.
Yee, G. (2006). Privacy protection for e-services.
Hershey, PA: IGI Publishing.

Ratnasingam, P. (2003). Inter-organizational trust


for business to business e-commerce. Hershey,
PA: IRM Press.


Chapter IX

A Profile of the Demographics, Psychological Predispositions, and Social/Behavioral Patterns of Computer Hacker Insiders and Outsiders
Bernadette H. Schell
University of Ontario Institute of Technology, Canada
Thomas J. Holt
The University of North Carolina at Charlotte, USA

Abstract
This chapter looks at the literature (myths and realities) surrounding the demographics, psychological
predispositions, and social/behavioral patterns of computer hackers, to better understand the harms that
can be caused to targeted persons and property by online breaches. The authors suggest that a number
of prevailing theories regarding those in the computer underground (CU), such as those espoused by the psychosexual theorists, may be less accurate than theories based on gender role socialization, given
recent empirical studies designed to better understand those in the CU and why they engage in hacking
and cracking activities. The authors conclude the chapter by maintaining that online breaches and online
concerns regarding privacy, security, and trust will require much more complex solutions than currently
exist, and that teams of experts in psychology, criminology, law, and information technology security
need to collaborate to bring about more effective real-world solutions for the virtual world.



Introduction
Hackers are the elite corps of computer designers
and programmers. They like to see themselves
as the wizards and warriors of tech. Designing
software and inventing algorithms can involve
bravura intellection, and tinkering with them is as
much fun as fiddling with engines. Hackers have
their own culture, their own language. And in the
off-hours, they can turn their ingenuity to sparring
with enemies on the Net, or to the midnight stroll
through systems you should not be able to enter,
were you not so very clever. Dark-side hackers, or
crackers, slip into systems for the smash-and-grab,
but most hackers are in it for the virtuoso ingress.
It is a high-stress life, but it can be amazing fun.
Imagine being paid, and paid well, to play forever with the toys you love. Imagine. (St. Jude, Mondo 2000: User's Guide to the New Edge)
Since its appearance in the United States in
the second part of the twentieth century, the Internet has been the topic of arduous study from
a number of academic disciplines, including the
social sciences and criminology, business, law,
computer science, and political science. In recent
decades, as the Internet has expanded at unprecedented rates, and with different socio-economic
interests becoming increasingly involved, the
Internet's impact on global citizens' daily lives
has been profound. The Internet has become one
of the most important ways of communicating
internationally in real time (such as is the case
with online activism (known in the information technology field as hacktivism). Also, the
complex infrastructure of the Internet has on
the positive side facilitated a number of common
activities, such as e-commerce, Internet banking, online gaming, and online voting, and has
provided a more level political and economic
playing field for citizens residing in both developed and developing nations, particularly in
China, India, Russia, and Pakistan.

Moreover, in recent years in developed nations,


professionals have been able to broaden their
returns to society by adopting Internet-related
technologies. For example, using hand-held devices, doctors have been able to access patients'
health histories and diagnostic records over the
Internet without having to rely on snail mail
courier services, and high-tech billionaires such
as those who started the Google search engine
(with a November, 2005, market cap of US$120
billion) have pushed the online entrepreneurial
envelope to a whole new, higher, and societally beneficial plane (Schell, 2007).
However, with the growth of and diversity
in Internet traffic, a dark side has surfaced, particularly since the late 1980s as more and more
citizens have become able to afford personal
computers (PCs) and online accounts. Thus, tech-savvy criminals have increasingly made use of the Internet to perpetrate online crimes, causing
an increase in incidences of online child exploitation, identity theft, intellectual property theft,
worm and virus infestations of business and home
computers, and online fraud involving its many
presentations (e-commerce, voting, and gaming). Consequently, Internet-connected citizens
worldwide have become increasingly fearful that
their privacy, including personal health histories,
banking transactions, social security numbers,
and online voting preferences, would be vulnerable to destruction or alteration by mal-inclined
computer hackers (known as crackers). Too,
business leaders have become concerned that not
only will their computer networks be tampered with by
tech-savvy outsiders but also by insider employees
determined to destroy critical business data when,
say, they leave the firm under less than happy
circumstances (Schell, 2007).
In the last decade, in particular, with the
growth of the Internet and electronic commerce (e-commerce), the amount of personal information that
can potentially be collected about individuals by
corporations, financial and medical institutions,
and governments has also increased. Such data


collection, along with usage tracking and the


sharing of data with third parties, especially in light of the reality that such actions can easily be accomplished through high-speed links and high-capacity storage devices without the consumer's expressed knowledge or consent, has raised a
multitude of issues among Internet users about
privacy (by definition, the state of being free from
unauthorized access), security (by definition, being protected from adversaries, particularly from
those who would do harm, even unintentionally,
to property or to a person or persons), and trust
(by definition, the element present in business
relationships when one partner willingly depends
on an exchanging partner in whom one has confidence, including online business exchanges)
(Grami & Schell, 2004).
Computer systems are regularly attacked to do
harm or for personal gain in today's wired and
wireless world (see Taylor, Caeti, Loper, Fritsch, &
Liederback, 2006). Such mal-inclined attacks are
often referred to as "hacks," but in the computer underground (CU) they are known as "cracks."
Both the positively-motivated, authorized hackers and the negatively-motivated, non-authorized
crackers have a profound interest in computers
and technology, and they like to use their knowledge to access computer systems (Schell, Dodge,
& Moutsatsos, 2002). Many in the general public
identify hackers as a primary threat to computer
security, and there is significant media attention
given to dramatic computer crimes attributed to
them (Furnell, 2002).
From a technical perspective, privacy issues
in the security sense regarding cracks of computer systems include digital rights management,
spam deterrence, anonymity maintenance, and
disclosure rule adequacy. Over the past few
years, a number of recorded cyber intruders into
companies' and institutions' computer networks
have violated the privacy rights of employees
and online registrants. In 2005, for example,
the media publicized evidence suggesting that
the Internet information brokerage industry


is poorly regulated, for on or about March 10,


cyber criminals stole the passwords of as many as 32,000 legitimate American online users from a database owned by the renowned LexisNexis
Group. Similar computer network breaches occurred at about that time at ChoicePoint, Inc., and
at the Bank of America, prompting calls for U.S. federal government oversight through the
General Services Administration to investigate
the matter. Recommendations were to follow
about providing adequate protection for the safety
of federal employees information, and fears of
identity theft surfaced on a broad scale. In short,
the present-day reality is that regardless of how
well intentioned management is about protecting
employees privacy rights as they navigate online,
valuable personal information can be collected by
hidden online tools such as cookies (small bits
of data transmitted from a Web server to a Web
browser that personalize a Web site for users)
and Web bugs, and then that information can be
shared with third parties for marketing purposes
or surveillance.
Moreover, with the increased usage in recent
years of cellular phones and hand-held computers, users have become vulnerable to security
breaches, for wireless communications rely on
open and public transmission media (over the
air). Thus, the mobile security challenges relate
to the user's mobile device, the wireless access
network, the wired-line backbone network, and
mobile commerce software applications. All of
these challenges must be addressed by the companies releasing such products into the marketplace,
including providing fixes for software vulnerabilities. Moreover, unlike wire-line networks, the
uniqueness of wireless networks poses a number
of complex challenges for security experts, such
as vulnerability of the air interface, an open peer-to-peer (P2P) network architecture (in mobile
and ad hoc networks), a shared wireless medium,
the limited computing power of mobile devices,
a highly dynamic network topology, and the low
data rates and frequent disconnects of wireless


communications. The media have often reported


that coping with these wireless devices' vulnerabilities costs companies considerable money;
for example, the chargeback rate for credit card
transactions using wireless devices is about 15
times higher than that for in-store point-of-sale
credit card transactions (Grami & Schell, 2004).
Mobile services, in general, are prone to two
types of risks as a result of security vulnerabilities: subscription fraud (more commonly known
as identity theft) and device theft (i.e., stolen
devices).
In 2003, the notion of the vulnerability of
wireless computer systems came to the forefront
in an interesting case involving a U.S. computer
security analyst named Stefan Puffer. After he
determined that the Harris County district clerk's wireless computer network was vulnerable to crack attacks, he warned the clerk's office that
anyone with a wireless network card could gain
access to their sensitive data. Though Puffer was
charged by police for cracking the network, he
was later acquitted by a Texas jury. Moreover,
on September 8, 2003, while many students were
returning to school, the media reported that a
young hacker from Massachusetts pleaded guilty
to cracking Paris Hilton's T-Mobile cellular phone
and dumped personal information of hers on the
Internet for millions around the world to see. He
also admitted to cracking other computer systems
a year earlier and to stealing personal information
without authorization. But his exploits did not stop
there; the youth said he also sent bomb threats
through the Internet to high schools in Florida
and Massachusetts. The authorities estimated that the harm to persons and to property from these exploits totaled US$1 million, and
like other juveniles who have cracked systems,
he was sentenced to eleven months in a detention
center (Schell, 2007).
Finally, the essence of any business transaction, whether online, in industry, or in retail outlets, depends on trust, commonly expressed in laws, contracts, regulations, policies, and personal reputations. Recent evidence has indicated that though consumers would tend not to initially
trust someone knocking on a house door trying
to sell expensive goods, many consumers using
the Internet to purchase goods or to communicate
with others seem to be overly trusting. Numerous consumers fall victim to spammers (those
taking advantage of users' e-mail accounts by
swamping them with unwanted advertising using
false but legitimate-looking headers), download
virus-infected software, or engage in online
chat rooms with strangers, sometimes sharing
personal information which can later cause them
or their loved ones harm. Though many efforts
in the United States have been made to eradicate
spam, including the creation of filters to stop it
from getting through the network and the passage
of laws like the U.S. CAN-SPAM Act of 2003, to
date, these remedies have proven not to be fully
effective.
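As a rough illustration of what such filtering involves, the hypothetical sketch below applies two of the simplest checks, a forged-looking header test and a keyword score on the subject line. The phrase list, threshold, and sample message are invented for the example, and production filters are considerably more sophisticated.

# Hypothetical sketch of a naive header/keyword spam heuristic.
from email import message_from_string
from email.utils import parseaddr

SUSPECT_PHRASES = ("act now", "free money", "winner", "click here")

def spam_score(raw_message):
    msg = message_from_string(raw_message)
    score = 0
    # Forged-looking headers: the visible From domain never appears in the
    # Received chain that actually carried the message.
    _, from_addr = parseaddr(msg.get("From", ""))
    from_domain = from_addr.partition("@")[2].lower()
    received = " ".join(msg.get_all("Received", [])).lower()
    if from_domain and from_domain not in received:
        score += 2
    # Crude content check on the subject line.
    subject = msg.get("Subject", "").lower()
    score += sum(1 for phrase in SUSPECT_PHRASES if phrase in subject)
    return score

if __name__ == "__main__":
    sample = "From: promo@example.biz\nReceived: from mail.other.net\nSubject: You are a WINNER, click here\n\nHi"
    # Treat a score of 3 or more as probable spam (illustrative threshold).
    print("probable spam" if spam_score(sample) >= 3 else "probably legitimate")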
For example, on April 8, 2005, a landmark legal
case concluded with spammer Jeremy Jaynes of Raleigh, North Carolina, who went by the name "Gaven Stubberfield," being found guilty of massive spamming and sentenced to nine years in a U.S. prison under anti-spam legislation. Described by prosecutors as being among
the top 10 spammers in the world, this case is
considered to be important because it was the
United States' first successful felony prosecution
for transmitting spam over the Internet. Jaynes
apparently transmitted 10 million e-mails a day
using 16 high-speed lines. For these mal-inclined
exploits and breaches of trust, he allegedly earned
as much as $750,000 a month on his spamming
operation (Schell & Martin, 2006).
Another interesting cyber case of breach of
online trust occurred in March, 2005, when the
Harvard Business School administration said that
because of unauthorized intrusions, they decided
to reject almost 120 applicants who had followed
a hacker's instructions on how to break into the university's admissions Web site to learn
if they had been admitted to this prestigious

university. The administrators argued that these
intrusions were unethical and that the actions
of the potential students breached trust. Other
universities took similar punitive approaches to
similar cracking incidents, including Carnegie
Mellon Universitys School of Business. The common thread involved in these breaches was the
use of the Apply Yourself online application and
notification software (Associated Press, 2005).
Breaking into a computer typically involves
discovering vulnerabilities and then creating an
exploit (a program or set of instructions to be followed) to take advantage of the vulnerabilities.
These vulnerabilities, as well as their related
exploit programs, if released into the public
domain, can be used by many other individuals,
regardless of their inclinations (Rescorla, 2005).
For example, system administrators tend to use
them to test their systems, and good-natured computer hackers maintain that they capitalize on the
vulnerabilities just to have a good time.
There are also the malicious crackers who scan
systems to determine which have vulnerabilities
and then plan an attack. Crackers typically aim to
get revenge on some perceived enemy or to make
a profit from the cyber attack. It is not uncommon,
in fact, for crackers to verify the success of their
attacks, for it brings them considerable pleasure
(Meinel, 2006).
While detailing how hackers break into computers is beyond the scope of this chapter, readers
can refer to Appendix A, "How Do Hackers Break into Computers?" by Carolyn Meinel, in Schell and Martin (2006), to get a better idea of how exploits are completed.
Given the significant amount of valuable
information contained in business, government,
and personal computers worldwide, it is necessary to consider the risks and prevalence of attacks against computer networks. According to
recent estimates, the costs to victims of malicious
computer attacks have totaled more than $10 billion since 2000, including recent cellular phone
exploits (IBM Research, 2006).

The CSI/FBI (Computer Security Institute/
Federal Bureau of Investigation) Computer Crime
and Security Survey has annually attempted to
assess the costs of computer attacks to industry,
medical and educational institutions, and government agencies for the past 12 years by asking those
involved in the IT security role to respond to the
questions posed. The most recent (2007) version is known simply as the CSI survey, designed solely by the Computer Security Institute.
The 2007 survey findings revealed that the
average annual loss reported in this past year
rose to $350,424 from $168,000 the previous
year. Not since the 2004 report have the average
losses been this high. Moreover, about one-fifth
of the IT security respondents said that their firms
had suffered a "targeted attack," meaning that the malware attack was aimed exclusively at their organization or those within a given subset. Also, financial fraud in 2007 overtook virus attacks, the major source of financial loss in 2006. Another significant cause of loss was
system intrusion by outsiders. Insider abuse of
network access or e-mail (including trafficking
in child exploitation pornography or software
pirating) also edged out virus incidents as the most prevalent form of security problem, with
59% and 52% of the respondents, respectively,
reporting these (Richardson, 2007).
The 2006 CSI/FBI survey findings further
revealed that most IT respondents perceived that
their organizations' major cyber losses resulted
from system breaches by outsiders (i.e., those
not employed by the company). However, about
33% of the respondents believed that insider
threats accounted for at least one-third of their
sizable network abuse problems (Gordon, Loeb,
Lucyshyn, & Richardson, 2006).
The 2006 survey results suggested that within
the enterprise security perimeter, the news is
good, for the survey respondents maintained that
they are keeping their cyber crime losses lower.
At the same time, in the developed and developing world, our economic reliance on computers

and technology is growing, and so are criminal
threats. Because criminal threats are becoming
more sophisticated, IT security experts around
the globe should not overestimate these recent
gains (Cincu & Richardson, 2006).
It is vital that researchers in a number of fields
better understand the psychological and behavioral
composition of network attackers and the social
dynamics that they operate within. Knowledge of
the personalities, behaviors, and communication
patterns of computer attackers can help security
researchers and practitioners to prepare for future
exploits and to reduce costs due to electronic intrusion, alteration, and theft. Better preparation can
also minimize damage to consumer confidence,
privacy, and security in e-commerce Web sites
and general information-sharing within and across
companies (Cincu & Richardson, 2006).

PURPOSE OF THIS CHAPTER

The purpose of this chapter is to summarize what is known in the literature about the demographic, psychological, and social/behavioral patterns of computer hackers and crackers. This information can improve our knowledge of cyber intruders and aid in the development of effective techniques and best practices to stop them in their tracks. There is little question among IT security experts that when it comes to privacy issues, hackers and crackers are often ignored. As a matter of fact, if crackers do attack an international database containing high degrees of personal and/or homeland security information (with the 2005 LexisNexis database exploit serving as a smaller-scale case in point), this large-scale exploit could cause massive disasters affecting citizens across multitudes of jurisdictions, including a critical infrastructure failure. This chapter intends to assist in shedding light on what is known about how hackers and crackers generally tend to think and behave.

Specific topics covered in this chapter include:

Background: A briefing on basic hacking/cracking vocabulary, the coincidence of the four critical elements constituting a cyber crime, and the common types of unauthorized use
Issue and controversy #1: Known demographic and behavioral profiles of hackers and crackers – behavioral misfits or seemingly normals?
Issue and controversy #2: Psychological myths and truths about those in the computer underground (CU) – do they tend to be disease-prone or self-healer types?
Future trends: How present strategies for dealing with online privacy, security, and trust issues need to be improved

BACKGROUND

As noted, though the words hacker and cracker
are regularly used interchangeably by the media
and the public, these two terms have distinct meanings within the CU (Furnell, 2002; Holt, 2007).
The word hacker typically refers to a person who
enjoys learning the details of computer systems
and how to stretch their capabilities (Furnell,
2002). Crackers tend to be malicious meddlers
trying to discover information by deception or
illegal means, often with the intent to do harm to
another person or to another's property for revenge
or personal gain (Furnell, 2002).
There are also variations of hackers within
the CU, based on their motives and actions while
hacking (Holt, 2007). For example, White Hat
hackers are individuals who use their skills to
benefit or protect computer systems. The term
Black Hat hacker often refers to those hackers
who maliciously damage or harm networks. In
this context, a hack to gain knowledge or serve
as a warning to security personnel in a company
that a computer system is not properly protected
may be defined by members of the CU as good
and positively motivated. However, if the hacking

event occurs because of a perpetrator's need for
revenge, sabotage, blackmail, or greed, this action
may be labeled as wrong, and possibly criminal
in nature (Schell et al., 2002).
While the public and the media suggest that
the bulk of computer crack attacks, or exploits, are completed by sophisticated hackers
with a detailed knowledge of computer systems
and how they work, the reality is somewhat different. Evidence suggests that most perpetrators
who attack networks are younger than age 30,
are bored, and, often, are in need of peer recognition and acceptance (Schell et al., 2002). In
fact, most industrial and government computer
system invaders are not sophisticated hackers;
they are often teenagers out to be challenged and
to be recognized for their exploits by their peers.
The less skilled individuals who engage in these
sorts of attacks are typically referred to as script
kiddies (Furnell, 2002; Schell et al., 2002; Holt,
2007). This term is often used derisively within
the CU, as it recognizes a person's dependence
on pre-made scripts that facilitate hacks. Using
a program to hack suggests that the individual
does not have significant computer knowledge
and is not truly a hacker (Furnell, 2002; Schell
et al., 2002; Holt, 2007).
There are several other terms used to describe
hackers and computer attackers, particularly those
interested in politically-motivated cyber attacks.
For example, the term cyber terrorist refers to
individuals who use hacking techniques to attack
networks, systems, or data under the motivation
of a particular social or political agenda (Furnell,
2002), and hacktivists are those who break into
computer systems to promote an activist agenda,
often defacing Web sites to express an opinion
(Furnell, 2002).
While this list is by no means an exhaustive
one of the different labels applied to those in the
CU or their activities, it, nonetheless, demonstrates
the remarkable variation present in motives, thinking, and behavior within the CU.

Cyber Crime Defined
The growth and spread of the Internet across the
world over the last two decades has fostered the
growth of a variety of online crimes as well as
laws aimed at curbing these behaviors. In fact,
a unique debate has developed concerning the
definition of online crimes, using both the terms "cyber crime" and "computer crime." Cyber crimes typically occur
because the individual uses special knowledge
of cyberspace, while computer crimes involve
special knowledge of computer technology. The
interrelated nature of these behaviors complicates
this definition process, and many individuals use
the terms interchangeably. As a consequence, the
term "cyber crime" will be used in this chapter
to refer to any crime completed either on or with
a computer (Furnell, 2002).
Cyber crime generally includes electronic
commerce (e-commerce) theft, intellectual
property rights (IPR) or copyright infringement,
privacy rights infringement, and identity theft.
Also, cyber crime involves such activities as child
exploitation and pornography; credit card fraud;
cyberstalking; defaming or threatening another
user online; gaining unauthorized access to computer networks; ignoring or abusing copyright,
software licensing, and trademark protection;
overriding encryption to make illegal copies of
software; software piracy; and stealing anothers
identity to conduct criminal acts. Although variations on the parameters constituting these unlawful acts, as well as the penalties corresponding to
the infringements, may vary from one jurisdiction
to another worldwide, this list is relevant and
pertinent (Schell & Martin, 2006).
Taken from a broad perspective, cyber crime is
not all that different from the more conventional
real-world crime. In fact, one of the most well-known cyber crime typologies classifies behavior along lines similar to those of traditional crime, including trespass, theft, obscenity, and violence (Wall,
2001). While this framework identifies multiple
forms of potentially criminal cyber behaviors,

the criminal acts of trespass (defined as entering
unlawfully into an area to commit an offense)
and theft (an act occurring when someone takes,
or exercises illegal control over, the property of
another to deprive that owner of the asset) tend
to garner the lion's share of attention from the
media (Furnell, 2002).
Types of cyber trespass and theft commonly include but are not limited to (Schell & Martin, 2004):

Flooding: a form of cyberspace vandalism resulting in denial of service (DoS) to authorized users of a Web site or computer system (see the sketch following this list);
Virus and worm production and release: a form of cyberspace vandalism causing corruption and, possibly, the erasing of data;
Spoofing: the cyberspace appropriation of an authentic user's identity by non-authentic users, causing fraud or attempted fraud and commonly known as identity theft;
Phreaking: a form of cyberspace theft and/or fraud consisting of the use of technology to make free telephone calls; and
Infringing intellectual property rights (IPR) and copyright: a form of cyberspace theft involving the copying of a target's information or software without getting their consent.
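To illustrate the flooding concept from the defender's side, the following hypothetical sketch counts requests per client address over a short sliding window and flags rates far above normal traffic. The window length, threshold, and address are assumptions made only for this example.

# Hypothetical sketch: flag clients whose request rate suggests flooding.
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 10
MAX_REQUESTS_PER_WINDOW = 100  # above this, treat the client as a suspected flooder

recent_requests = defaultdict(deque)  # client IP -> timestamps of recent requests

def record_request(client_ip, now=None):
    """Record one request; return True if the client now looks like a flooder."""
    now = time.time() if now is None else now
    timestamps = recent_requests[client_ip]
    timestamps.append(now)
    # Drop timestamps that have fallen out of the sliding window.
    while timestamps and now - timestamps[0] > WINDOW_SECONDS:
        timestamps.popleft()
    return len(timestamps) > MAX_REQUESTS_PER_WINDOW

if __name__ == "__main__":
    # Example: 150 requests from one address within a single second trips the check.
    flagged = any(record_request("198.51.100.7", i / 150) for i in range(150))
    print("suspected flooding" if flagged else "traffic looks normal")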

The Four Critical Elements of Cyber Crimes
Harm resulting from cyber crimes, as in conventional crimes, can be to property, to persons,
or to both. As in the conventional world, in the
cyber world, there are politically-motivated cyber
crimes, controversial crimes, and technical non-offenses. For a cyber crime and a conventional
crime to exist in U.S. jurisdictions, four elements
must be present (Brenner, 2001):

Actus reus (the prohibited act, or failing to act when one is under a duty to do so)
Mens rea (a culpable mental state)
Attendant circumstances (the presence of certain necessary conditions)
Harm (to either persons or property, or both)

Here is an example illustrating the four


elements for a property cyber crime involving
criminal trespass, whereby the perpetrator intends
to steal information from another. A cyber perpetrator gains entry into a computer and unlawfully
takes control of the property, the information of
another user (actus reus). He or she enters with the
intent to commit an offense by law and acts with
the intent of depriving the lawful owner of data
(mens rea). By society's norms, the perpetrator
has no legal right to enter the computer network
(i.e., is not authorized to do so) or to gain control of
the targeted software (attendant circumstances).
Consequently, the cyber perpetrator is liable for
his or her unlawful acts, for he or she unlawfully
entered the computer (that is, criminal trespass)
to commit an offense once access was gained
(i.e., theft). In the end, the targeted user was not
able to access data, resulting in harm to the target
(Schell & Martin, 2004).

The Changing Nature of Cyber Crime and the Need for Emerging Legislation
As the nature of cyber crime has evolved, so have
the legal structures to prosecute and punish these
behaviors. Most newsworthy cyber crime cases
have been prosecuted in the United States under
the computer crime statute 18 U.S.C. Section
1030. The primary federal statute criminalizing
cracking was originally the Computer Fraud and
Abuse Act (CFAA), which was modified in 1996
by the National Information Infrastructure Protection Act and codified at 18 U.S.C. Section

1030, "Fraud and Related Activity in Connection with Computers."
If caught in the United States, crackers are
often charged with intentionally causing damage
without authorization to a protected computer. A
first offender typically faces up to 5 years in prison
and fines up to $250,000 per count, or twice the loss
suffered by the targets. The U.S. federal sentencing guidelines for cracking have been expanded
in recent years to provide longer sentences, from 20 years behind bars to life sentences, if exploits
lead to injury or death of online citizens (Krebs,
2003). The targets of cyber crimes can also seek
civil penalties (Evans & McKenna, 2000).
It should be noted that while finding a cracker
may not be an easy task for law enforcement because of the rather anonymous nature of the cyber
environment, successfully prosecuting a cracker
can be even tougher, for enforcement ability falls
to any jurisdiction that has suffered the effects of
the crack. If, for example, the corporate targets
of the attack are in the United States, then U.S.
laws would apply. Practically speaking, only
those jurisdictions with the cracker physically
in their locale will be able to enforce their laws.
Therefore, though the United States or Canada
may attempt to apply their country's laws to any
given cracking incident, the perpetrator needs to
be physically in their jurisdiction to enforce the
law (Walton, 2000).
After the September 11, 2001 terrorist attacks
on the World Trade Center, the U.S. government
became increasingly concerned about terrorist
attacks of various natures and homeland security
protection. To this end, the U.S. passed a series of
laws aimed at halting computer criminals, including the 2002 Homeland Security Act, with section
225 known as the Cyber Security Enhancement
Act of 2002. In 2003, the Prosecutorial Remedies
and Tools against the Exploitation of Children
Today Act (PROTECT Act) was passed to assist
law enforcement agents in their efforts to track
and identify pedophiles using the Internet for
child exploitation purposes. Also in 2003, the
CAN-SPAM Act was passed by the U.S. Congress, aimed at decreasing the issues raised by
commercial e-mailers and spammers. Its longer
title was the Controlling the Assault of Non-Solicited Pornography and Marketing Act of 2003,
a title accurately reflecting its purpose.
Other countries have enacted similar anti-intrusion legislation. For example, section 342.1
of the Canadian Criminal Code is aimed at a
number of potential harms, including theft of
computer services, invasion of privacy, trading
in computer passwords, or cracking encryption
systems. Charges for violations are made according to the sections of the Criminal Code dealing
with theft, fraud, computer abuse, data abuse,
and the interception of communications (Schell
& Martin, 2004). In fact, the Global Cyber Law
Survey of 50 countries around the world found
that 70% of countries had legislation against
unauthorized computer access, as well as data
tampering, sabotage, malware (malicious software) usage, and fraud.

ISSUE AND CONTROVERSY #1: KNOWN DEMOGRAPHIC AND BEHAVIORAL PROFILES OF HACKERS AND CRACKERS – BEHAVIORAL MISFITS OR SEEMINGLY NORMALS?
In light of the significant agreement on the potential harm caused by unauthorized computer
intrusions and malicious cracks, it is necessary to
consider the demographic, behavioral, and social
composition of the CU. In short, who is likely to be
responsible for these exploits, and are they really
all that different from individuals in mainstream
society? While the clothes that hackers wear seem
to have shifted a bit from the 1960s (when long hair
and sandals were the norm) through the present
(where backpacks and black t-shirts are the norm),
hackers and crackers in the experimental phase
still seem to be predominately males under age
30 (Gilboa, 1996; Jordan & Taylor, 1998; Schell
et al., 2002). But why is this?


Theoretical Framework
From its very beginning as an all-male Tech
Model Railroad Club in the 1960s at MIT, where
the geeks had an insatiable curiosity about how things (and particularly how a slow-moving hunk of metal called the PDP-1) worked, the CU has
attracted to this day predominantly men to its
fold. Back then, because of the PDP-1s turtle-like
pace, the smarter computer programmers at MIT
created what they called hacks, or programming
shortcuts, to complete their computing tasks more
efficiently. In fact, the club's adoption of the term "hacker" to describe its members as well as their acts indicated a creative individual who could "push the envelope" around what computers
were designed to do. The club's talented hackers became the seed of MIT's Artificial Intelligence (AI) Lab, the world's prime center of AI research.
The AI lab's fame and influence spread fast in 1969, the year in which the Advanced Research Projects Agency Network, or ARPANET, was formed: the first transcontinental, high-speed computer network, created by the U.S. Defense Department as an experiment in digital communications (Schell et al., 2002).
It is interesting to note that the positive, creative
reputation associated with those in the CU has
over the years taken on a negative connotation.
Since the 1980s, the media seem to have focused
on the darker side, frequently reporting the costs
due to property and personal harm as a result of
computer exploits by those in the CU. Moreover,
this rather less than positive picture has also been
painted by theorists trying to understand this
rather unique population.
One school of thought posited by psychosexual
theorists argues that hacking can be viewed as a
way for young men to fulfill their erotic desires
through electronic means (Taylor, 2003). This
notion is generated, in part, by stereotypical conceptions of hackers as introverted, socially inept,
or awkward males who have difficulty relating to
others (Furnell, 2002; Taylor et al., 2006). Certain

technologically gifted young males' inability to
connect in meaningful ways with other people,
especially women, these psychosexual theorists
argue, drives them to spend more of their time
with computers and technology (Taylor, 2003).
Through very focused and solitary involvement
with their computers, young men in the CU
become intimately connected with technology
and with hacking. The shouting and swearing in
arcades, the fixation on war and sports games, the
focus on speed, and the presence of primarily men
on software packages and computer games are all
characteristics of this broader-based computer
culture that is stereotypically male (Cohoon &
Aspray, 2006).
These psychosexual theorists further maintain
that the knowledge base young hackers and crackers develop allows them to take out their sexual
frustrations through their computer hardware and
software. The effort taken to bend a computer
system to an individual's will can provide a sense of
physical gratification that directly mirrors physical
masturbation (Keller, 1988; Ullman, 1997). The
destructive or vandalism-based hacks of young
male script kiddies, in particular, are similar to
masturbation in that they usually have no necessary objective or goal aside from the pleasure of
the act itself (Taylor, 2003). These activities may,
however, have little appeal for women, especially
when coupled with their difficulty of communicating with self-focused male hackers (Jordan &
Taylor, 2004).
A second and more inclusive explanation of
the male stereotype of those in the CU relates
to gender role socialization on-line and off-line
(Jordan & Taylor, 2004; Taylor, 2003). Wajcman
(1991) suggests that gender and technology influence and shape one another, such that technology is a source of and consequence of gender
relationships. For example, there are gender
differences in the way that humans interact with
and approach technology (Turkle, 1984). Males
tend to use "hard mastery" techniques involving
a distinct and decisive plan to impose their will

over the device, whereas women tend to practice
"soft mastery" techniques based on interaction
with the device and responding to its processes
(Turkle, 1984). Moreover, these theorists argue,
over recent decades, gender role socialization
has resulted in an ongoing interest by males in
technology, to the near exclusion of females.
The creators of early computers, it is posited,
were mostly men who used analytical and rational
thinking to develop technology, especially the
command-line interfaces of the early systems
(Levy, 1984). These traits are in direct contrast to
the prevalent role-specific traits purported to exist in females (nurturing, caring, and protecting).
The gender role socialization theorists argue that
given this societal bias, since the 1960s, young
females have not been steered by male mentors
to become enamored with technology, primarily
because it runs counter to females' supposed
natural personality traits and softer approaches
to technology.

Present-Day Myths and Reality


If asked to describe the demographic, behavioral,
and social composition of those in the CU, most
people in mainstream society would probably
suggest that hackers have a low tolerance for
business suits and business attire, preferring to
wear clothes optimizing comfort, function, and
minimal maintenance. Onlookers would also
likely add that hackers are obviously connected
to their computers, perhaps even addicted to them.
Also, mainstream citizens would likely suggest
that besides their computers, hackers as a group
would seem to like music, chess, war games,
and intellectual games of all kinds. Moreover,
assert onlookers in the mainstream, if hackers
were to engage in sports, they would probably
choose those that are self-competitive rather than
team-oriented, as afforded by the martial arts,
bicycling, auto racing, hiking, rock climbing,
aviation, and juggling. In terms of religion and
self-control, most mainstream onlookers might

suggest that hackers would probably describe
themselves as agnostics, atheists, or followers of
Zen Buddhism or Taoism, and they would probably
tend to avoid substances that would make them
become cognitively stupid (such as alcohol), but
they would tend to ingest high-caffeine drinks
as an aid to staying awake long hours so that
they could hack. And when communicating with
one another online, many mainstream onlookers
would probably add that hackers would tend to use
monikers (like Mafiaboy) rather than their own
names. It is interesting to note that given these
rather superficial attributes commonly assigned
to hackers by mainstream citizens, many of the
hackers, when asked, would tend to concur with
these observations. As a matter of fact, this is a
subset of descriptors for hackers appearing from
1996 through 2002 at the popular hacking Web
site http://project.cyberpunk.ru/links.html.
Aside from these appearance descriptors,
little else but myths about those in the CU existed until just a few years ago. Contrary to some
commonly-held beliefs among the public and the
media that hackers are behavioral misfits with
odd sexual relationships (whereby bi-sexual and
trans-sexual relationships outnumber those in the
adult population), are largely unemployed, and
have strange sleeping patterns (with a propensity
to hack through the night), a comprehensive behavioral and psychological assessment study by
Schell et al. (2002) on 210 hacker attendees of the
Hackers on Planet Earth (HOPE) 2000 gathering
in New York City and of the DefCon gathering
in Las Vegas in July, 2000, found that these three
behavioral myths about those in the CUand
other mythswere generally unfounded.
Of the respondents in this Schell et al. (2002)
hacker study, 91% were males and 9% were females, representative, the authors argued, of the
predominance of males in the CU population. The
mean age of respondents was 25 years, with the
youngest respondent being 14 years and with the
eldest being 61 years.

Regarding sexual relationship propensity, a
significant 79% of the hacker survey respondents
(males and females) claimed to be monogamous
heterosexual, and those approaching or exceeding
age 30 were likely to be gainfully employed, with
the mean salary for males placing at about $57,000
(n = 190) and with the mean salary for females
placing at about $50,000 (n = 18). The largest
reported income was $700,000. So from a sexual
proclivity and employment standpoint, males and
females in the CU seem to be quite similar to those
in mainstream society and are generally sound
contributors to the economic wealth of society.
Moreover, from a financial angle, a t-test on the
findings revealed that female hackers seem to be
as financially rewarded as their male peers.
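For readers unfamiliar with the statistic, the sketch below shows the general form such a two-sample t-test takes; the salary figures are invented for illustration, and this is not the authors' actual data or analysis.

# Hypothetical sketch of a two-sample (Welch's) t-test on reported salaries.
from scipy import stats

male_salaries = [52_000, 61_000, 48_000, 70_000, 55_000, 59_000, 63_000]
female_salaries = [50_000, 47_000, 58_000, 52_000, 49_000]

# Welch's variant does not assume equal variances between the two groups.
t_stat, p_value = stats.ttest_ind(male_salaries, female_salaries, equal_var=False)

# A p-value above the usual 0.05 threshold would be read, as in the study's
# conclusion, as no significant difference in financial reward by gender.
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")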
The hacker study authors also affirmed that
there were some statistically significant findings
of demographic and behavioral mean score differences between the White Hats and the Black
Hats in the study. The White Hats were designated
as those who reported being motivated primarily by achievement or by societal/organizational
gains (such as advancing networks or software
and computer capabilities and hacking to expose
weaknesses in organizations' computer systems or in their products, but for the overall good of
society), whereas the Black Hats were designated
as those who reported being motivated by narcissistic needs such as hacking for personal financial
gain without regard to the personal and property
costs to others or for enhancing their reputation
within the hacker community/world without
regard to the costs to others.
Using these definitions, Schell et al. (2002)
classified 12 of the respondents as Black Hats.
Demographically, these individuals were, on
average, 27 years old (1 year younger, on average, than their White Hat counterparts), male
(but not exclusively), and American. The Black
Hats earned significantly more than their White
Hat counterparts, almost double. In fact, the
Black Hats (male and female) earned, on average,
$98,000 annually, while the White Hats (male

and female) earned, on average, $54,000. Moreover, the Black Hat and White Hat males tended
to work in large companies (with an average of
5,673 employees) and were generally not charged
with hacking-related crimes (but not exclusively),
whereas the females (White Hat and Black Hat)
tended to prefer working in smaller companies
with an average of about 1,400 employees.
Interestingly and consistent with previous
study findings and with the myth about hackers
that they value information and activities that
make them smarter, both the Black Hat and the
White Hat hackers in the Schell et al. (2002) study
tended to be self- and other-taught (like the founding hackers at MIT) and were quite well educated,
with at least a community college education. The
female White Hats and Black Hats, consistent with
the gender role socialization theory, admitted to
generally learning their computer skills later in life
at college or university, largely because they were
not steered in the technology direction by parents,
teachers, or career counselors. Also, consistent
with Meyer's (1989) earlier study suggesting that
neophyte hackers are drawn to computers from
an early age and tinker with them on their own
time, the most frequent response (39%, n = 83) to
the item asking the hackers (male and female) how
they learned their computer skills was that they
were self-taught. The next largest pocket (7% of
the respondents) said that they were self-taught,
completed formal courses and courses on the job,
and learned from friends and relatives.
Regarding the myth in the public domain about
odd sleeping patterns of hackers in the CU, a significant 79% of the hacker respondents (males and
females) said that they sleep sometime during the
night from 12 midnight through 8 A.M., similar
to individuals sleeping patterns in mainstream
culture. Thus, Schell et al. (2002) noted that this
myth about odd sleeping patterns was unfounded
by their study findings.
Regarding prevailing culture within the CU,
Thomas (2002) has suggested, as earlier noted, that
those in the CU operate within a predominately

competitive culture centered on demonstrations of
mastery, wherein talented individuals show their
ability to dominate their social and physical environment using their computers and their brains.
Those in the CU taunt and challenge the ability
of others while online, with the ultimate goal of
gaining complete control of another individuals
computer system, often referred to as "taking root" or "0wning a box." Sometimes this taunting can get out of hand, moving into the adverse impact
areas of cyber harassment and cyber stalking.
According to popular myth and consistent with
the assertions of Thomas (2002), the Schell et al.
(2002) study findings revealed that the male and
female hackers adopt online monikers to protect
their anonymity and privacy while online and to
decrease the chances of being cyber harassed.
The majority, 63% of the respondents, said
that they typically use a net handle or moniker
to identify themselves online, with only 10% of
the respondents admitting to using their birth name
alone, and with 27% claiming that they use a
combination of their birth names and net handles
while online. Out in the real world, the hacker
respondents affirmed that they tended to use their
birth names only, indicating that when they are
operating in mainstream society, they take on a
rather mainstream persona. Again, a significant
56% of the respondents said that they used their
net handles specifically for hacking activities.
In this 2002 study, as suggested more recently
by Holt (2007), the female hackers confirmed
that they may adopt male monikers or handles to
obfuscate their gender, to increase the likelihood
that they are accepted by others in the CU, and
to reduce the fear that they may be exposed to
high levels of harassment via "flaming" (the situation where individuals direct online obscene or
abusive messages to another online user to upset
that individual and to provoke distress).
Finally, contrary to the communication patterns posited by the psychosexual theorists and their stereotypical conceptions of hackers as introverted, socially inept, or awkward males who have difficulty relating to others
(Furnell, 2002; Taylor et al., 2006), the Schell et al.
(2002) study findings paint a somewhat different
picture. While the psychosexual theorists tend to
agree with the prevailing myth about hackers that
they communicate only with their computers and
not with other people and that they are loners, the
just-cited 2002 study found that, as Meyer's earlier 1989 study reported, hackers spend considerable time during the week, about 25% of it, communicating with their colleagues. Moreover, while 57%
of the respondents said they like to hack solo,
the remaining 43% of the respondents (male and
female) said that they prefer to collaborate with
others when hacking.
In summary, the literature seems to suggest
that as more becomes known about those in the
CU, the picture that emerges is quite different from
the dark-side palette that prevails in the minds
of the public, the media, and the psychosexual
theorists. More support appears to be accumulating that paints a clearer picture of the positive
sides of those in the hacker community, debunks
a number of demographic and behavioral myths
prevailing about individuals in the CU, and points
to the accuracy of the gender role socialization
theory. Considering the concern of industry and
society about cyber crimes, and the huge costs
to society regarding breaches of privacy, security,
and trust, are there some psychological indicators that may point to a proclivity to cause harm
to property and persons by those in the computer
underground?

ISSUE AND CONTROVERSY #2: PSYCHOLOGICAL MYTHS AND TRUTHS ABOUT THOSE IN THE COMPUTER UNDERGROUND (CU) – DO THEY TEND TO BE DISEASE-PRONE OR SELF-HEALER TYPES?

In light of the significant agreement among law enforcement agents, the public, industry, and those in the CU on the potential harm caused by unauthorized computer intrusions and malicious cracks, it
is necessary to consider the psychological makeup of those operating daily in the CU. In short,
who is likely to be responsible for these exploits,
and are these hackers really all that different from
individuals in mainstream society?

Theoretical Framework
At the start of this chapter, we addressed a number
of concerns that industry and the general public
have about privacy, security, and trust. However,
in recent months, even those in the CU have
expressed concerns about harmful episodes that
have jeopardized their psychological safety and
could interfere over the longer term with their
personal safety. For example, in March, 2007,
anonymous online death threats were levied
against Kathy Sierra, a popular Web developer
within the information technology community,
author, and blogger who encourages companies
to consider human behavior when designing their
technological products. While many bloggers rallied to her support, online and off-line, a number
of women and men got online to talk about their
incidents of online bullying, harassment, and
stalking (Fost, 2007).
By definition, online bullying entails verbally
abusing targets by threatening to cause harm to
one's reputation; cyber harassment uses cyber
space to harass a targeted individual; and cyber
stalking occurs when individuals repeatedly
deliver online unwanted, threatening, and offensive e-mail or other personal communications
to targeted individuals, including death threats
(Schell & Martin, 2006). All of these threats are
intended to cause psychological damage to others,
and some of these exploits may actually result in
death to the targets.
One of the most well known cyber stalking
cases reported in the popular media involved a
young cracker named Eric Burns (a.k.a. Zyklon).
Eric Burns' claim to fame is that he attacked the

Web pages of about 80 businesses and government offices whose pages were hosted by Laser.Net in Fairfax, Virginia. Burns, a creative individual, designed a program called "Web bandit"
to identify computers on the Internet that were
vulnerable to attack. He then used the vulnerable systems to advertise his proclamations of
love for a young classmate named Crystal. These
computer exploits by Burns became his way of
advertising worldwide his unrelenting love [or,
more accurately, to get the attention of and then
to take control of or to take root of] Crystal. He
hoped that by proclaiming his love in the cyber
world, he would get her attention, if
not her long-term commitment. This real-world
case ended with the 19-year-old male pleading
guilty to attacking the Web pages for NATO and
Vice President Al Gore. In November, 1999, the
judge hearing the case ruled that Burns should
serve 15 months in federal prison for his cracking
exploits, pay $36,240 in restitution, and not be
allowed to touch a computer for 3 years after his
release. The irony in this case is that the young
woman named Crystal attended the same high
school as Eric but hardly knew him. In the end,
she assisted the authorities in his capture, but Eric
Burns, in his role as cyber stalker, did not make
the media headlines; only the fact that he cracked Web sites did (Reuters, 1999).
Mental health experts who assessed Eric Burns
declared that he likely felt more comfortable communicating online than in person, he had difficulty
overcoming his fear of rejection by people in the
real world, he lacked the social skills to repair
relationships, and he was mocking authority by
saying something like, "I can put my favorite girl's name on your Web site for everyone to see, but you can't get me." In short, Eric Burns had a
number of pre-existing mental health issues that
he acted out online (Schell et al., 2002).
The case of Eric Burns is symbolic on a number of planes, including that most of what the
literature reports about the psychological profiles
of those in the CU has been gleaned from legal

documents following convicted crackers' arrests,
both insiders and outsiders. Moreover, Eric Burns
as a cyber stalker is likely not too different from
real-world stalkers who have been charged and
convicted. Stalking experts tend to agree that
those who commit these crimes in the virtual
world are likely not much different psychologically from those who commit such crimes in the
mainstream (Schell & Lanteigne, 2000).
So when these crackers are caught, convicted,
and analyzed by mental health experts, how are
they classified? In general, mental health experts
tend to describe the overall profile of working
adults as being either predominantly self-healing in nature (i.e., they recover well from stressful life episodes, and they are generally psychologically and physically well individuals over the longer term) or predominantly disease-prone (i.e.,
they tend not to recover well from stressful life
episodes, they often suffer from bouts of depression in the short-term, and in the longer term,
they often suffer from diseases such as premature
cardiovascular disease or cancer) (Schell, 1997).
As noted, Eric Burns would likely be placed in
the category of disease-proneness.
Consistent with some earlier reported offender profile findings developed on computer
science students and information systems (IS)
employees, Shaw, Post, and Ruby in 1999 said that
insider computer criminals, in particular, tend
to have eight traits that are more disease-prone
than self-healing. In particular, insider crackers
who cause damage to company computers and/or
to individuals within these organizations are: (1)
introverted; (2) they have a history of significant
family problems in early childhood, leaving
them with negative attitudes toward authority;
(3) they have an online computer dependency
that significantly interferes with or replaces direct
social interactions in adulthood; (4) they have an
ethical flexibility that allows them to justify their
violations, even if they get caught; (5) they have
a stronger loyalty to their computer specialty
than to their employers; (6) they have a sense of

entitlement, thinking that they are special and
are thus owed the corresponding recognition; (7)
they tend to have a lack of empathy, preferring to
disregard the adverse impact of their actions on
others; and (8) because of their introverted natures, they are less likely to seek direct assistance
from their supervisors or from their company's
employee assistance program (EAP) when they
are distressed.
Earlier in 1997, Ehud Avner constructed what
he called a "personality analysis," completed in several countries, of mal-inclined information
systems employees. His results found that the prototypical insider IT criminal rarely gets caught,
because he or she tends to be a manager or a high-ranking clerk without a criminal record, he or she
commits the crack in the course of normal and
legal system operations, and he or she seems on
the surface to be a bright, thorough, highly motivated, diligent, and trustworthy employee, until
caught. When caught, he or she tends to say that
he or she did not intend to hurt anyone, that banks
steal more than he or she did, or that he or she only
tried to prove to their employers that it is possible
to crack the vulnerable system. It is important to
remember that the probable reason for their not
getting caught earlier is that these cyber criminals
frequently have a considerable number of White
Hat traits, as well as Black Hat ones, which generally keep them out of investigators' eyes
(Schell et al., 2002).
Considering this growth in knowledge about
insider and outsider convicted crackers, noted
Schell et al. in 2002, what was still sorely missing from the literature on those in the CU as of
2000 was a psychological profile of the White Hat
hackers in the CU. What were they like psychologically, and did they differ substantially from
the Black Hats or those charged and convicted
of cyber crimes?

Present-Day Myths and Reality


While a number of researchers have increasingly
reported that disease-prone types and self-healing

types likely exist in the CU, few have been able
to describe the relative proportion of each. Accepting this observation, researchers have become
curious about the supposed lack of bias and fair
play in the CU, as in recent times, anecdotal
evidence (such as that of Sierra in March, 2007)
seems to indicate that some degree of rejection
or adverse attention-seeking by online peers is a
real concern and needs to be addressed as a major
online privacy, security, and trust issue within the
CU (Gilboa, 1996; Holt, 2006; Jordan & Taylor,
1998; Sagan, 2000; Schell, 2007).
As noted in the demographic section, the media
have focused on the disease-prone side of those in
the CU. Myths surfacing in the headlines since the
1980s, some founded and some not, include the
notion that hackers seem to have troubled childhoods, marked by a history of alcoholic parents,
abandonment by one or more parents, and parental
discord (Shaw et al., 1999), and that this could
be a primary reason for their apparent need to
rebel against authority figures. The Canadian
cracker known as Mafiaboy, who in 2000 raised
concerns when he cracked Internet servers and
used them as launching pads for denial of service
(DoS) attacks on the Web sites of Amazon, eBay,
and Yahoo, is a poster boy for a cracker with a
troubled childhood.
Another prevailing myth about hackers is
that because they have the predisposition and the
capability to be multi-tasked (Meyer, 1989) and appear to be computer addicted (Young, 1996), they
are likely stressed-out in the short-term and are
likely to be cardiovascular-prone at early ages (i.e., disease-prone Type A) over the longer term. Thus, those who cite this myth argue, hackers in the CU should be labeled as predominantly disease-prone.
Finally, if there is a bright light to shine in
terms of prevailing hacker myths, it is that hackers are generally perceived by the public and
the media to be creative individuals, a definite
self-healing trait.
As noted, the Schell et al. 2002 study presents
the most comprehensive picture, to date, of those

inhabiting the CU, including their psychological


make-up.
Regarding the first myth about childhood
trauma for those in the CU, while 28% of the hacker
convention respondents in this 2002 study said
that they had experienced childhood trauma or
significant personal losses, the majority of hacker
respondents did not make such claims. However,
supportive of this myth, of those hackers who
reported having a troubled childhood, the majority (61%) said that they knew that these events
had a long-term adverse impact on their thoughts
and behaviors. Moreover, a t-test analysis on the
findings revealed that female hackers were more
likely to admit experiencing childhood trauma
than their male counterparts, but there was no
significant difference in the reporting of childhood trauma for those charged and not charged
with crimes, or for those hackers under age 30 and
those over age 30 (Schell et al., 2002).
Though many mental health experts would
seem to suggest those in the CU are predominantly
task-obsessed and perfectionist Type A individuals who are cardiovascular self-destructors,
Schell et al. (2002) reported that those individuals
attending hacker conventions actually tend to be
more moderate, self-healing Type B individuals in nature.
Moreover, a current study being undertaken by
the authors of this chapter seems to indicate that
those attending hacker conventions may actually
possess relatively high degrees of Asperger's Syndrome, or the "Nerd Syndrome," and that this
constellation of traits may, in fact, protect hackers
in the CU from becoming too stressed-out by the
highly competitive and, at times, very aggressive
nature of this virtual environment (Fitzgerald &
Corvin, 2001). The ability to screen out distractions is a geeky trait that can be extremely useful
to computer programmers, in particular, and many
of these traits in milder forms seem to be descriptive of computer hackers (Nash, 2002).
Regarding the third bright light myth, Schell
et al. (2002) corroborated with their study findings

that those in the CU are creative individuals who
report using predominantly complex analytical
and conceptual styles of problem-solving and
decision-making. These researchers reported
that on the 20-item Creative Personality Test
of Dubrin, for example, the majority of hacker
respondents (62%) had scores meeting or
exceeding the critical creative level score of
15, thus supporting this myth. Further analysis
revealed no significant differences in the creativity mean scores for the males and the females,
for those charged and not charged with crimes in
the CU, and for those under age 30 and over age
30. Moreover, using the 20-item Decision Style
Inventory III of Rowe and colleagues, these
researchers found that the highest mean scores
for decision-making and problem-solving styles
of hackers placed in the analytic and conceptual
styles, supporting the myth that those in the CU
are cognitively complex and creative in their
thinking predispositions.
In closing, in an effort to try to determine who
might have high degrees of disease-proneness and
the propensity to cause harm to persons and to
property in the CU, Schell et al. (2002) completed
a number of analyses on the Black Hat segment
that admitted to being motivated to take revenge
on targets. The traits for these individuals were
that they tended to be under age 30, they had less
formal education than their White Hat peers, they
reported significantly higher hostility and anger
stress symptoms in the short term than their White
Hat peers, they reported higher depression stress
symptoms and more longer-lasting (and addictive)
hacking sessions than their comparative group,
and they had higher psychopathic and highly
narcissistic personality predispositions than their
self-healing counterparts. Thus, the researchers
concluded that the group of hackers in the CU most at risk for committing self- and other-destructive acts appeared to be under age 30,
narcissistic, angry, obsessive individuals suffering from repeat bouts of depression. The authors
suggested that the percentage of high-risk hackers

ready to cause harm to persons, in particular, may
be as high as 7%.

FUTURE TRENDS: HOW PRESENT STRATEGIES FOR DEALING WITH ONLINE PRIVACY, SECURITY, AND TRUST ISSUES NEED TO BE IMPROVED
This chapter has discussed, at length, how
evolving cyber crime legislation and an increased
knowledge of the demographic, psychological, and
social/behavioral propensities of those in the CU
may, combined, lead to better methods of not only
curbing cyber crime but of better understanding
who may commit it, how, and why.
However, other solutions to deal
with the privacy, security, and trust issues in
cyber space have been developed and need to be
discussed.

Solutions for Online Privacy


Recent public surveys have shown that a number
of consumers are still afraid to buy goods and services online, because they fear that their personal
information (particularly credit card and social
security numbers) will be used by someone else.
Moreover, consumers remain wary despite assurances from credit card companies that they will not hold consumers accountable for any false charges. In recent times, trust seals and increased government regulation have become two main ways of promoting improved privacy disclosures on the Internet.
Trust seals nowadays appear on e-business
Web sites, including green TRUSTe images, the
BBBOnLine (Better Business Bureau OnLine)
padlocks, and a host of other privacy and security
seals. In fact, some companies are paying up to
$13,000 annually to display these logos on their
Web sites in the hopes of having consumers relate
positively to their efforts to provide online privacy.

In fact, almost half of the Fortune 100 companies
display such logos, and of the fourteen information
technology Web sites in the Fortune 100 companies, 10 have such seals (Cline, 2003).
The question remains, however, as to whether trust seals really work. Although they are intended to advance privacy for consumers primarily through business self-regulation rather than legislation, critics often suggest that these trust seals
than for consumers. But those supporting the
usefulness of trust seals note that if companies
display them on their Web sites, they need to
follow the trust standards, provide online clients
with a means of opting out of direct marketing
and having their personal information sold to
third parties, and give consumers a way to access
the company's information and file complaints.
Also on the positive side, ScanAlert, an emerging
security seal provider, argues that Internet sales
are reported to increase by 10% to 30% if the
trust seals are displayed (Cline, 2003).
From consumers' submissions, a study conducted by Flinn and Lumsden (2005) indicated
that 42% of the consumers surveyed in their study
said that they were more likely to trust a Web site
that displays a trust mark than those not having
the display, and 49% of the respondents said that
they are likely to trust a Web site only if they are
able to recognize the trust mark program.
Furthermore, while government regulations
in North America are increasing to advance the
privacy of citizens, by passing laws like the
Canadian Personal Information Protection and
Electronic Documents Act (PIPEDA) and the U.S.
Health Insurance Portability and Accountability
Act (HIPAA) of 1996, citizens are commonly
uneasy with such pieces of legislation, for consumers dislike having their telecommunications
traffic monitored by government agents. As a
result of these privacy concerns, the field of information ethics (IE) has emerged to deal with
issues arising from connecting technology with
concepts such as privacy, intellectual property

rights (IPR), information access, and intellectual


freedom. Although IE issues have been raised as
early as 1980, nowadays the field is more complex, spurred on by the concerns of a number of
academic disciplines regarding Internet abuses.
According to IE, information itself, in some form
or role, is recognized to have intrinsic moral value.
Thus, theoreticians have formulated a number of
complex mathematical solutions for providing
better information protection over the Internet.
This is a field that is sure to bring more comprehensive solutions to online privacy concerns in
future years.

Solutions for Online Security


Generally, businesses and government agencies
take two kinds of approaches to prevent security
breaches: proactive approachessuch as preventing crackers from launching attacks in the first
place (typically through various cryptographic
techniques), and reactive approaches, by detecting security threats after the fact and applying
appropriate fixes. The two combined allow for
comprehensive network solutions. In technical
circles, securing Web sites generally refers to the
use of SSL (secure sockets layer) technology for
encrypting and authenticating HTTP (hypertext
transfer protocol) connections (Flinn & Lumsden,
2005).
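As a simple illustration of this, the following Python sketch opens such a connection; the host name is only a placeholder, and the point is that certificate and host-name verification take place during the TLS handshake, before any HTTP data is exchanged:

import socket
import ssl

# Minimal sketch: fetch a page over HTTPS so that the HTTP exchange is
# encrypted and the server is authenticated via its certificate chain.
# The host name below is only an example.
host = "www.example.com"
context = ssl.create_default_context()      # loads the trusted CA roots
with socket.create_connection((host, 443)) as raw_sock:
    # wrap_socket performs the TLS handshake; certificate and host-name
    # verification happen here and raise ssl.SSLError on failure
    with context.wrap_socket(raw_sock, server_hostname=host) as tls_sock:
        print("negotiated protocol:", tls_sock.version())
        print("server certificate subject:", tls_sock.getpeercert().get("subject"))
        tls_sock.sendall(b"GET / HTTP/1.1\r\nHost: " + host.encode() +
                         b"\r\nConnection: close\r\n\r\n")
        print(tls_sock.recv(200).decode(errors="replace"))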
Moreover, because network security is a chain,
it is only as secure as its weakest link. Although
enhanced network security features are desirable,
they cost money, a key consideration for
companies and institutions, especially the smaller
ones that are often reluctant to apply an enhanced
security solution because of prohibitive costs.
These costs are associated with, for example, additional overhead (such as increased bandwidth),
increased complexity (requiring specialized
security personnel), and information processing
delays, which can,
in turn, degrade network performance (Grami &
Schell, 2004). Thus, if smaller companies in the chain cannot afford appropriate security features, the whole chain is vulnerable to attack.
Also, industry has come forward in recent
years with some innovative commercial tools to
assist network administrators in preventing network intrusions. One such tool, designed by Dan
Farmer and Wietse Venema in 1995, is known as
SATAN (security administrator tool for analyzing networks). This tool works by procuring as
much data as possible about system and network
services, and upon discovering vulnerabilities,
it gives rather limited data to network administrators to assist them in fixing the problem. Its
successor, SAINT, is also on the market to assist
in this regard.
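The data-gathering step that such tools automate can be illustrated with a short Python sketch (a toy example, not SATAN or SAINT; it probes only the local machine, since scanning systems one does not administer may be unlawful):

import socket

# Try to connect to a few well-known TCP ports and report which ones answer;
# SATAN-style tools automate and greatly extend this kind of service discovery.
COMMON_PORTS = {21: "ftp", 22: "ssh", 25: "smtp", 80: "http",
                110: "pop3", 143: "imap", 443: "https"}

def probe(host: str, port: int, timeout: float = 0.5) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

host = "127.0.0.1"                               # probe only the local machine
open_services = {p: name for p, name in COMMON_PORTS.items() if probe(host, p)}
print(host, "reachable services:", open_services or "none found")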
Other improvements are on the way to help
improve online security. From a standards perspective, the emerging IEEE 802.11i standard will address wireless security issues in particular,
and turn wireless networking into a more trusted
medium for all users, including the prevention of
DoS problems caused when the entire network is
jammed. (The attack could be against the client's wireless device or against the network's access
point). Jamming has, to date, been difficult to
stop, largely because most wireless local area
networking technologies use unlicensed frequencies and are subject to interference from a variety
of sources (Grami & Schell, 2004).

Solutions for Online Trust


As noted earlier in this chapter, a major barrier
to the success of online commerce has been the
fundamental lack of faith between business and
consumer partners. This lack of trust by consumers is largely caused by their having to provide
detailed personal and confidential information to
companies on request. Also, consumers fear that
their credit card number could be used for purposes
other than that for which permission was given.
From the business partner's trust vantage point,
the company is not sure if the credit card number
the consumer gives is genuine, is in good credit standing, and actually belongs to the consumer trying to complete the transaction.
In short, communicating with unknowns
through the Internet elicits two crucial sets of
questions that require reflection: One, what is
the real identity of other person(s) on the Internet
and can their identities be authenticated? Two,
how reliable are other persons on the Internet,
and is it safe to interact with them? (Jonczy &
Haenni, 2005).
Consumer and business trust online is based
on such authentication issues, and in recent years,
a number of products have been developed to assist in the authentication process, including the
following (Schell, 2007):

•	Biometrics, which assess users' signatures, facial features, and other biological identifiers;
•	Smart cards, which have microprocessor chips that run cryptographic algorithms and store a private key;
•	Digital certificates, which contain public or private keys, the values needed to encrypt or decrypt a message; and
•	SecureID, a commercial product using a key and the current time to generate a random number stream that is verifiable by a server, thus ensuring that a potential user enters a verifiable number from the card within a set amount of time (typically 5 or 10 seconds); a simplified sketch of this kind of time-based code generation follows.
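The sketch below illustrates time-based code generation in Python; the shared secret and the 10-second window are illustrative, and the code follows the generic HMAC-based one-time-password idea rather than any vendor's proprietary algorithm:

import hashlib
import hmac
import struct
import time

def time_based_code(secret: bytes, interval: int = 10, digits: int = 6) -> str:
    """Derive a short-lived numeric code from a shared secret and the clock.

    The token and the server run the same computation; the server accepts the
    code only while the current time window (here 10 seconds) is still open.
    """
    counter = int(time.time()) // interval            # current time window
    msg = struct.pack(">Q", counter)                  # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation
    value = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(value % (10 ** digits)).zfill(digits)

shared_secret = b"example-shared-secret"              # illustrative value only
print("one-time code:", time_based_code(shared_secret))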

Experts agree that trusted authentication management in a distributed network like the Internet (the thrust of the second question above) is not easy, for in a decentralized authority, every user is also a potential issuer of credentials. Furthermore, a given set of credentials,
perhaps issued by many different users, forms a
credential network. Therefore, a "web of trust" model and solution was introduced by Pretty Good Privacy (PGP), a popular application for e-mail authentication. Without getting too technical, PGP organizes public keys and corresponding
certificates in local key rings. The owner of the
key ring gets a web of trust by assigning trust
values to issuers. This web of trust thus acts
as the basis for a qualitative evaluation of the
authenticity of the public keys involved. In PGP,
the evaluation of a web of trust is founded on
three rules and on the production of two outcomes:
valid or invalid (Jonczy & Haenni, 2005).
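A toy version of this evaluation is sketched below; names and thresholds are illustrative (real PGP distinguishes more trust levels), but the idea that a key becomes valid once enough trusted issuers have signed it is the same:

# Toy "web of trust" evaluation, loosely in the spirit of PGP key rings.
FULL, MARGINAL, UNTRUSTED = "full", "marginal", "untrusted"

issuer_trust = {                 # trust the key-ring owner assigns to issuers
    "alice": FULL,
    "bob": MARGINAL,
    "carol": MARGINAL,
    "mallory": UNTRUSTED,
}

signatures = {                   # issuers who have certified each public key
    "dave-key": ["alice"],                # one fully trusted signature
    "erin-key": ["bob", "carol"],         # two marginally trusted signatures
    "frank-key": ["mallory", "bob"],      # not enough trusted signatures
}

def key_validity(key: str) -> str:
    """Return 'valid' or 'invalid' for a key in the credential network."""
    sigs = signatures.get(key, [])
    full = sum(issuer_trust.get(s) == FULL for s in sigs)
    marginal = sum(issuer_trust.get(s) == MARGINAL for s in sigs)
    return "valid" if full >= 1 or marginal >= 2 else "invalid"

for k in signatures:
    print(k, "->", key_validity(k))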
Unfortunately, to date, authentication systems
like PGP have failed to gain large acceptance and
to solve real-world trust problems such as spam,
primarily because they suffer from a number
of deployment usability issues, as well as trust
management issues. For example, in web of
trust style systems, Internet users must validate
keys out-of-band, which is a laborious task. Better
solutions need to be developed which minimize
these shortcomings.
Finally, it is important to recognize that in
recent years there has been some effort by experts to set standards and indicators to capture, in a more systematic and coordinated fashion, the
trustworthiness state of a particular information
technology infrastructure, including the Internet.
Such indicators would reflect the assurance of
the IT infrastructure to reliably transfer information (including security, quality of service, and
availability of service), thus increasing consumers' trust in the network. These indicators could
then be used to identify areas of the infrastructure requiring attention and be utilized by an IT
organization to assess the return on investment
(ROI) for improved IT infrastructure equipment
purchase. Unfortunately, despite the existing work
in progress, there is still no standard or widely accepted method of assessing assurance levels
associated with IT infrastructures, including
end-hosts, servers, applications, routers, firewalls,
and the network permitting the subsystems to
communicate. Clearly, this is an area where
academics and security experts need to focus to
find more effective trust solutions (Schell, 2007;
Seddigh, Pieda, Matrawy, Nandy, Lombadaris,
& Hatfield, 2004).

Closing
This chapter has discussed a number of privacy,
security, and trust issues affecting online consumers. Clearly, understanding these issues and finding solutions for themlegislative, technological,
sociological, or psychologicalis a complex chore
requiring experts in multiple fields, including law,
information technology, security, business, and
the social sciences. Likely the only way forward in
finding more comprehensive solutions is to adopt
a team approach for finding hybrid solutions that
are outside any one silo of expertise.

References
Associated Press. (2005). Business schools: Harvard to bar 119 applicants who hacked admissions
site. The Globe and Mail, March 9, B12.
Brenner, S. (2001). Is there such a thing as virtual
crime? Retrieved February 1, 2006, from http://
www.crime-research.org/library/Susan.htm
Cincu, J., & Richardson, R. (2006). Virus attacks
named leading culprit of financial loss by U.S.
companies in 2006 CSI/FBI computer crime and
security survey. Retrieved July 13, 2006, from
http://www.gocsi.com/press/20060712.jhtml
Cline, J. (2003). The ROI of privacy seals.
Retrieved October 12, 2007, from http://www.
computerworld.com/developmentopics/websitemgmt/story/0,10801,81633,00.html
Cohoon, J. M., & Aspray, W. (2006). A critical
review of the research on women's participation in postsecondary computing education. In J. M. Cohoon & W. Aspray (Eds.),
Women and information technology: Research on
underrepresentation (pp. 137-180). Cambridge,
MA: MIT Press.
Evans, M., & McKenna, B. (2000). Dragnet targets
Internet vandals. The Globe and Mail, February
10, A1, A10.
Fitzgerald, M., & Corvin, A. (2001). Diagnosis and


differential diagnosis of Asperger syndrome. Advances in Psychiatric Treatment, 7(4), 310-318.
Flinn, S., & Lumsden, J. (2005). User perceptions
of privacy and security on the web. Retrieved October 12, 2007, from http://www.lib.unb.ca/Texts/
PST/2005/pdf/flinn.pdf

Jonczy, J., & Haenni, R. (2005). Credential networks: A general model for distributed trust and
authenticity management. Retrieved on October
10, 2007, from http://www.lib.unb.ca/Texts/
PST/2005/pdf/jonczy.pdf
Jordan, T., & Taylor, P. (1998). A sociology of hackers. The Sociological Review, 46(4), 757-780.

Fost, D. (2007). The technology chronicles: The attack on Kathy Sierra. Retrieved March 27, 2007,
from http://www.sfgate.com/cgi-bin/blogs/sfgate/
detail?blogid=19&century_id=14783

Keller, L. S. (1988, July). Machismo and the hacker mentality: Some personal observations and speculations. Paper presented to the WiC (Women in Computing) Conference.

Furnell, S. (2002). Cybercrime: Vandalizing the information society. Boston, MA: Addison-Wesley.

Krebs, B. (2003). Hackers to face tougher sentences. Washington Post. Retrieved February 4,
2004, from http://www.washingtonpost.com/ac2/
wp-dyn/A35261-2003Oct2?language=printer

Gilboa, W. (1996). Elites, lamers, narcs, and whores: Exploring the computer underground. In L. Cherny & E. R. Weise (Eds.), Wired women: Gender and new realities in cyberspace (pp. 98-113). Seattle, WA: Seal Press.
Gordon, L. A., Loeb, M. P., Lucyshyn, W., &
Richardson, R. (2007). 2006 CSI/FBI computer
crime survey. Retrieved March 3, 2007, from
http://www.GoCSI.com
Grami, A., & Schell, B. (2004). Future trends
in mobile commerce: Service offerings, technological advances, and security challenges.
Retrieved October 13, 2004, from http://dev.hil.
unb.ca/Texts/PST/pdf/grami.pdf
Holt, T.J. (2006, June). Gender and hacking. Paper
presented at the CarolinaCon 06 Convention,
Raleigh, NC.
Holt, T. J. (2007). Subcultural evolution? Examining the influence of on- and off-line experiences on deviant subcultures. Deviant Behavior,
28(2), 171-198.
IBM Research. (2006). Global security analysis
lab: Fact sheet. IBM Research. Retrieved January
16, 2006, from http://domino.research.ibm.com/
comm/pr.nsf/pages/rsc.gsal.html

Levy, S. (1984). Hackers: Heroes of the computer revolution. New York: Dell.
Meinel, C. (2006). Appendix A: How do hackers
break into computers? In B. Schell &
C. Martin (Eds.), Webster's new world hacker
dictionary (pp. 373-380). Indianapolis, IN: Wiley
Publishing, Inc.
Meyer, G. R. (1989). The social organization of
the computer underworld. Retrieved October
13, 2007, from http://bak.spc.org/dms/archive/
hackorg.html
Nash, J. M. (2002). The geek syndrome. Retrieved
July 5, 2007, from http://www.time.com/time/covers/1101020506/scaspergers.html
Rescorla, E. (2005). Is finding security holes a good
idea? IEEE Security and Privacy, 3(1), 14-19.
Reuters. (1999). Teen pleads guilty to government hacks. Retrieved October 13, 2007, from
http://scout.wisc.edu/Projects/PastProjects/netnews/99-11/99-11- 23/0007.html
Richardson, R. (2007). 2007 CSI Computer Crime
and Security Survey. Retrieved October 12, 2007,
from http://www.GoCSI.com

Sagan, S. (2000). Hacker women are few but strong. Retrieved August 5, 2004, from http://
abcnews.go.com/sections/tech/DailyNews/hackerwomen000602.html#top
Schell, B.H. (2007). Contemporary world issues:
The internet and society. Santa Barbara, CA:
ABC-CLIO.
Schell, B. H. (1997). A self-diagnostic
approach to understanding organizational and
personal stressors: The C-O-P-E model for stress
reduction. Westport, CT: Quorum Books.
Schell, B. H., Dodge, J. L., & Moutsatos, S. S.
(2002). The hacking of America: Who's doing it,
why, and how. Westport, CT: Quorum Books.
Schell, B. H., & Lanteigne, N. (2000). Stalking,
harassment, and murder in the workplace: Guidelines for protection and prevention. Westport,
CT: Quorum Books.
Schell, B. H., & Martin, C. (2006). Webster's new
world hacker dictionary. Indianapolis, IN: Wiley
Publishing, Inc.
Schell, B. H., & Martin, C. (2004). Contemporary world issues: Cybercrime. Santa Barbara, CA: ABC-CLIO.
Seddigh, N., Pieda, P., Matrawy, A., Nandy, B.,
Lombadaris, J., & Hatfield, A. (2004). Current
trends and advances in information assurance
metrics. Retrieved October 10, 2007, from http://
dev.hil.unb.ca/Texts/PST/pdf/seddigh.pdf
Shaw, E., Post, J., & Ruby, K. (1999). Inside
the mind of the insider. Security Management,
43(12), 34-44.
Taylor, P. A. (2003). Maestros or misogynists?
Gender and the social construction of hacking. In
Y. Jewkes (Ed.), Dot.cons: Crime, deviance and
identity on the Internet (pp. 125-145). Portland,
OR: Willan Publishing.

Taylor, R. W., Caeti, T. J., Loper, D. K., Fritsch, E. J., & Liederback, J. (2006). Digital crime and digital terrorism. Upper Saddle River, NJ: Pearson Prentice Hall.
Thomas, D. (2002). Hacker culture. Minneapolis,
MN: University of Minnesota Press.
Turkle, S. (1984). The second self: Computers
and the human spirit. New York: Simon and
Schuster.
Ullman, E. (1997). Close to the machine: Technophilia and its discontents. San Francisco: City
Lights Books.
Wajcman, J. (1991). Feminism confronts technology. University Park, PA: Pennsylvania State
University Press.
Wall, D. S. (2001). Cybercrimes and the Internet.
In D.S. Wall (Ed.), Crime and the Internet (pp.
1-17). New York: Routledge.
Walton, D. (2000). Hackers tough to prosecute, FBI
says. The Globe and Mail, February 10, B5.
Young, K. S. (1996). Psychology of computer use:
XL. Addictive use of the Internet: A case that
breaks the stereotype. Psychological Reports,
79(3), 899-902.

Additional Readings
The Cybercrime Blackmarket. (2007). Retrieved
July 11, 2007, from http://www.symantec.com/avcenter/cybercrime/index_page5.html
Florio, E. (2005). When malware meets rootkits.
Retrieved July 11, 2007, from http://www.symantec.com/avcenter/reference/when.malware.
meets.rootkits.pdf
Grabowski, P., & Smith, R. (2001). Telecommunications fraud in the digital age: The convergence
of technologies. In D. S. Wall (Ed.), Crime and the
Internet (pp. 29-43). New York: Routledge.

Herring, S. C. (2004). Computer-mediated discourse analysis: An approach to researching online behavior. In S. A. Barab, R. Kling, & J. H.
Gray (Eds.), Designing for virtual communities in
the service of learning (pp. 338-376). New York:
Cambridge University Press.
Holt, T. J. (2003). Examining a transnational problem: An analysis of computer crime victimization
in eight countries from 1999 to 2001. International
Journal of Comparative and Applied Criminal
Justice, 27(2), 199-220.
Holt, T. J., & Graves, D. C. (2007). A qualitative
analysis of advanced fee fraud schemes. The
International Journal of Cyber-Criminology,
1(1), 137-154.
The Honeynet Project. (2001). Know your enemy:
Learning about security threats. Boston, MA:
Addison-Wesley.
James, L. (2006). Trojans & botnets & malware,
oh my! Presentation at ShmooCon 2006. Retrieved July 11, 2007, from http://www.shmoocon.
org/2006/presentations.html
Kleen, L. J. (2001). Malicious hackers: A framework for analysis and case study. Master's thesis, Air Force Institute of Technology. Retrieved
January 3, 2004, from http://www.iwar.org.uk/
iwar/resources/usaf/maxwell/students/2001/afitgor-ens-01m-09.pdf
Littman, J. (1997). The watchman: The twisted
life and crimes of serial hacker Kevin Poulsen.
New York: Little Brown.
Mann, D., & Sutton, M. (1998). Netcrime: More
change in the organization of thieving. British
Journal of Criminology, 38(2), 201-229.
Noblett, M. G., Pollitt, M. M., & Presley, L. A.
(2000). Recovering and examining computer
forensic evidence. Forensic Science Communications 2(4). Retrieved February 4, 2005, from
http://www.fbi.gov/hq/lab/fsc/backissu/oct2000/
computer.htm



Norman, P. (2001). Policing high tech crime within the global context: The role of transnational
policy networks. In D. S. Wall (Ed.), Crime and the
Internet (pp. 184-194). New York: Routledge.
Ollmann, G. (2004). The phishing guide: understanding and preventing phishing attacks.
Retrieved July 11, 2007, from http://www.ngssoftware.com/papers/NISRWP-Phishing.pdf
Parizo, E. B. (2005). Busted: The inside story
of operation firewall. Retrieved July 9, 2007,
from http://searchsecurity.techtarget.com/
originalContent/0,289142,sid14_gci1146949,00.
html
Savona, E. U., & Mignone, M. (2004). The fox
and the hunters: How IC
technologies change the crime race. European
Journal on Criminal Policy and Research, 10(1),
3-26.
Schell, B. H. (2007). Contemporary world issues: The internet and society. Santa Barbara,
CA: ABC-CLIO.
Sterling, B. (1992). The hacker crackdown: Law
and disorder on the electronic frontier. New
York: Bantam.
Taylor, P. A. (1999). Hackers: Crime in the digital
sublime. London: Routledge.
Taylor, P. A. (2003). Maestros or misogynists?
Gender and the social construction of hacking. In
Y. Jewkes (Ed.), Dot.cons: Crime, deviance and
identity on the Internet (pp. 125-145). Portland,
OR: Willan Publishing.
Taylor, R. W., Caeti, T. J., Loper, D. K., Fritsch,
E. J., & Liederback, J. (2006). Digital crime and
digital terrorism. Upper Saddle River, NJ: Pearson
Prentice Hall.
Thomas, D., & Loader, B. D. (2000). Introduction - cybercrime: Law enforcement, security, and surveillance in the information age. In D. Thomas & B. D. Loader (Eds.), Cybercrime: Law enforcement, security and surveillance in the information age (pp. 1-14). New York: Routledge.

Thomas, R., & Martin, J. (2006). The underground economy: Priceless. Login, 31(6), 7-6.

Wall, D. S. (1999). Cybercrimes: New wine, no bottles? In P. Davies, V. Jupp, & P. Francis (Eds.), Invisible crimes. London: Macmillan.

Wuest, C. (2005). Phishing in the middle of the stream--today's threats to on-line banking. Retrieved July 11, 2007, from http://www.symantec.com/avcenter/reference/phishing.in.the.middle.of.the.stream.pdf





Chapter X

Privacy or Performance Matters on the Internet:
Revisiting Privacy Toward a Situational Paradigm
Chiung-wen (Julia) Hsu
National Cheng Chi University, Taiwan

Abstract
This chapter introduces a situational paradigm as a means of studying online privacy. It argues that data subjects are not always opposed to data users; they judge contexts before disclosing information. This chapter demonstrates this by examining online privacy concerns and practices within two contexts: technology platforms and users' motivations. It explores the gratifications of online photo album users in Taiwan and finds a distinctive "staging" phenomenon under the theory of uses and gratifications and an a priori theoretical framework, the spectacle/performance paradigm. Users with diffused audience gratifications are concerned less about privacy but do not disclose more of their information. Furthermore, it finds that users act differently on diverse platforms, implying that studying the Internet as a whole is problematic. The author proposes that studying online privacy through a situational paradigm will lead to better research designs for studying privacy and assist in understanding users' behaviors across technology platforms.

Introduction
The common assumptions of the online privacy
concerns literature claim that net users who have higher privacy concerns disclose less information
and that data subjects are always adversarial to
data users. Thus, researchers with these assumptions ignore online environments, take privacy concerns as privacy practices, and follow the
off-line literature reviews to study what kind of
persons (demographical variables) are concerned
more about their privacy. This is called the adversarial paradigm, which does not take social
contexts into account (Hine & Eve, 1998; Hsu,
2006; Raab & Bennett, 1998).
What goes wrong with online privacy research under an adversarial paradigm? Researchers fail to explain why users who assert higher privacy concerns still disclose sensitive information, and fail to verify claims that those belonging to particular demographic categories are more concerned about privacy, which is not always the case in other research studies. Thus,
researchers instead have to find more social
contexts which are essential to users' privacy concerns and practices on the Internet, as well as study what makes users disclose more of their information: the so-called situational paradigm
(Hsu, 2006; Raab & Bennett, 1998).
In this study, the author tries to find more evidence for the main argument of the situational paradigm, whose underlying assumption is human relativism, a new approach for examining online privacy, especially for newly-emerging phenomena. What are
the newly-emerging phenomena on the Internet?
The author raises an example of online photo Web
sites. Online photo album Web sites were originally started for sharing digital memories with
friends and relatives. This trend is encouraged by
commercial online photo album Web sites which
provide free or fee spaces. In Taiwan, online
photo albums (usually with additional blog functions) are also popular among Internet users.
As a communication scholar, the author argues that communication is a post-disciplinary field in which the rigid walls of disciplinarity are replaced with bridges (Streeter, 1995). Online privacy is such an interdisciplinary topic, one to which communication could contribute alongside other disciplines. Given that the Internet is a mass medium (Morris & Ogan, 1996), the author assumes that uses and gratifications theory may pave the way for a situational paradigm in online privacy research.
The study suggests that media use is motivated
by needs and goals that are defined by audiences
themselves, and that active participation in the
communication process may assist, limit, or influence the gratifications associated with exposure.
Thus, different goals lead to diverse participation
and gratification. Current online privacy research
seldom takes users' motivations for Internet behaviors into account. How do these different uses, motivations, and gratifications influence their online privacy concerns and privacy practices? This is a necessary subject to investigate.
In addition to normal usage of online photo
albums, there is a distinct "staging" or "performing" phenomenon in Taiwan. For example, I-Ren
Wang, a now-famous celebrity, was recruited as
an anchor by TVBS, a cable news network, due
to her incredible popularity on the largest online
photo album, Wretch (Jhu & Yung, 2006). Other
girls, such as Cutiecherry and Fing, were
invited to participate in noted TV programs and
turned into commercial stars.
The majority of Internet users who have not yet become celebrities may enjoy having a
reputation among users, getting on a popular list,
or being discussed on the online chat system and
BBS. It also seems that online photo album Web
sites have developed into a stage for those who
want to become stars and celebrities. This implies
that the motivations for some online photo album
(a new media use context) users are quite different
from net users in previous studies.
Online photo album users are more like diffused audiences, the concept from the spectacle/
performance paradigm (SPP) (Abercrombie &
Longhurst, 1998). Adopting the diffused audience
cycle into the online photo album context, some users are drenched with mediascapes, integrate what
they learned from the mediascapes into everyday
life, and perform them for users' own visibility.
Others are drenched with mediascapes that facilitate discussions and help to attach themselves to
some idols. No matter what purpose users hold, after they achieve a certain level of narcissism,
they turn their attention to getting further media
drenching and performance.
Performing or staging on the Internet means
users usually disclose a large amount of their
personal information. Are those users aware of
possibly losing privacy and being transparent on
the Internet? Are those staging users concerned
less about their privacy or have they no idea of
the consequences of their behavior? There are two
possible results: one is that the staging users do
care about their privacy. Two is that they do not
care about privacy. The first outcome is parallel
to the situational arguments. The users' concerns
do not reflect directly on their behaviors.
The second result is even more interesting,
because it is so different from previous studies
that claim that concerns over privacy protection
on the Internet or Web sites might become an
obstacle to the diffusion of the Internet and the
future growth of e-commerce (Culnan, 1999a,
1999b; Federal Trade Commission, 1998, 1999,
2000; Milne, Culnan, & Greene, 2006). Why do
they ignore their privacy? What are their bottom lines and their disclosing strategies? What
are the implications for the privacy studies and
online business?
Online photo album Web sites have a very special characteristic: unlike other Internet technology platforms, they provide more visual cues, making visual anonymity impossible. This is another context worth studying. How does a visual
function on the Internet influence users' privacy
concerns and practices? An investigation into this
question will help us better understand how users
perceive privacy while using a visual function
and how different users' motivations for using a
visual function influence their privacy.
The "staging" phenomenon (observed through uses
and gratifications and the SPP theories) and the
Internet technology platform (online photo album
websites) are the two contexts developed in this
study to validate the situational paradigm. This study attempts to prove that these two contexts do influence users' privacy concerns and practices, which also verifies that human relativism is the
proper approach to study online privacy.

Background
Studying Privacy with the Situational Paradigm
The major flaw of the current definition of privacy is that it assumes that people are vulnerable
without considering situations. Therefore, privacy
risks are always deemed to be dangerous (Raab
& Bennett, 1998). According to the previous discussion, studying the nature of privacy, privacy
risks, and privacy protection with the adversarial
paradigm means one is always coping with new privacy infringements. As Moor (1997) puts it, the privacy concept has developed chronologically; in the current computer age, privacy has become very "informationally enriched." As such, there should be an updated approach to studying privacy.
Moor, Raab, and Bennett separately study
the nature of privacy, privacy risks, and privacy
protection from an adversarial paradigm toward
a situational paradigm, especially for Internet
settings. However, little research is aware that privacy concerns studies are still trapped in the adversarial paradigm. Nowadays, online privacy
concerns studies mostly adopt the off-line literature review and try to find what kind of Internet
users care more about their privacy by means of
using demographics as independent variables
(Hoffman, Novak, & Peralta, 1999; Kate, 1998;
Milne & Rohm, 2000; O'Neil, 2001; Sheehan,
2002). Nevertheless, the findings of the privacy
concerns literature focusing on demographics
usually are in conflict with each other. This implies
that privacy concerns are not static, but vary
with contexts.

Contexts are not a breakthrough idea. Most research studies are aware that understanding data subjects' demographic variables is not enough to explain and predict data subjects. Contexts also determine subjects' privacy practices and
concerns. Hine and Eve (1998) raise the idea of
situated privacy concerns by examining different
situations qualitatively. They find that there is
no particular kind of information that is always
privacy sensitive in all kinds of contexts.
Unlike the adversarial paradigm, researching
privacy concerns with the situational paradigm
needs to take two things into consideration. One
is the context of privacy risks and data subjects.
The other is a fundamental problem with studying privacy concerns, whereby it is necessary
to distinguish privacy concerns and privacy
practices. When researchers ask about net users
online privacy concerns, people tend to think
back to serious privacy infringements that they
have experienced or heard, and thus rate their
privacy concerns higher than what they actually
practice on the Internet. However, when researchers ask about net users Internet usage regarding
online privacy, users daily practices show that
they might not be concerned about privacy as
seriously as they report (GVU, 1998; Culnan &
Armstrong, 1999).
This study will examine privacy concerns and
privacy practices with the contexts of data subjects
separately. Contexts might be technology (Sixsmith & Murray, 2001), Web sites' performance
(Hsu, 2002; Culnan & Armstrong, 1999), privacy
regulations (Bennett, 1992), political system
(Plichtova & Brozmanova, 1997), culture/country
(In the adversarial paradigm, the country/culture
variable is usually taken to be demographics.
Privacy concern/practice research studies under
a situational paradigm revise the culture context
of subjects as a social context along with social
group), and so on.
As mentioned earlier, online photo album Web
sites are the new technology platform which should
be taken as a context. How do users have different privacy concerns under different technology
platforms? In the early age of the Internet, there
were several technology platforms: MUD, BBS,
newsgroups, mailing lists, World Wide Web,
e-mail, instant messaging (IM), and so forth. In
e-mail posts and archives, Sixsmith and Murray
(2001) point out that due to the absence of visual,
aural, and other elements of face-to-face communication (i.e., list serves, bulletin board discussion
groups, and chat rooms), those e-mail posts and
archive users are less aware of their audience and
believe they are interacting with only a limited
circulation of subscribed, interested members. If
researchers analyzed their message content or
outsiders used the same words, they would feel
their privacy had been invaded. When people use
e-mail posts and archives, they might not care
about their online contact information which is
already listed on the list serve or discussion board.
However, they might be concerned more about the
message contents they have been posting.
Online photo albums and blogs are so popular
nowadays among users (Spector, 2003; Trammell,
Williams, Postelnicu, & Landreville, 2006). These unprecedented services provide opportunities
for users to show their own personal information, pictures, and visual clips on the Internet. It
means users make visual cues of themselves or
their acquaintances public on the Internet, which
might be identified by other users. Visual cues
are very much related to privacy. Some research studies have investigated anonymity in CMC (computer-mediated communication)
to find out the differences between visual and
discursive anonymity (Barreto & Ellemers, 2002;
Lea, Spears, & de Groot, 2001; Postmes, Spears,
Sakhel, & de Groot, 2001).
Visual anonymity refers to the lack of any
visual presentation of users like pictures or video
clips. Discursive anonymity is a bit more complex.
Users' postings might reveal, to a certain degree, information about the message source. Anonymity may be viewed as fostering a sense of deindividuation, which sequentially contributes to self-disclosure (McKenna & Bargh, 2000). Visual
function, by contrast, may be viewed as fostering
a sense of individuation, which in turn contributes
to privacy concerns. However, this seems to be
a different situation for Taiwanese online photo
album users' staging behaviors.
Under said circumstances, it is necessary to
examine if a visual function facilitates users' privacy concerns on the Internet and if different motivations for using a visual function would influence users' privacy concerns. In order to know
their motivation, this study first adopts the uses
and gratifications approach.

Internet Uses and Gratifications


As people integrate new technology into their
lives, they often do not view it as just an update
of the fixed function, but assign it special values.
The cell phone enhances the fixed-line telephone
as a means of social connectedness (Wei & Lo,
2006). The media literature shows the same pattern. Ever since the Internet turned out to be so popular, a number of researchers have taken it as a mass medium (Morris & Ogan, 1996) and also have started to find its users' gratifications, but
its special values are often ignored, which will
be articulated later.
The uses and gratifications approach provides
an ample structure for studying new technology.
The approach basically assumes that individual
differences lead to each user seeking out diverse media and employing the media in a different
way. It is a user-centered perspective and implies that users are active in three ways: utility,
intentionality, and selectivity (Katz, Gurevitch,
& Hass, 1973; Katz, Blumler, & Gurevitch, 1974;
Palmgreen, Wenner, & Rosengren, 1985).
Some studies research the Internet as a whole,
while some explore specific forms of Internet
technology, such as homepages, search engines,
and BBS. They try to find users' motivations
and identify new factors when new forms come
out. In summary, net users attempt to fulfill their needs such as information seeking, interaction,
social avoidance, socialization, economic, entertainment, hobbies, relationship maintenance,
coolness, life convenience, and so on (Charney
& Greenberg, 2002; Hoffman & Novak, 1996;
Korgaonkar & Wolin, 1999; LaRose, Mastro, &
Eastin, 2001; Parker & Plank, 2000). The motivations for using both conventional media and the
Internet are essentially the same, but they are not
absolutely identical.
Song, Larose, Eastin, and Lin (2004) identify a
new gratification found only in Internet settings, virtual community, which suggests a new self-image
and social life on the Internet that has improved
on real life. In addition, self-image, self-identity,
and self-presentation are further recognized as
unique to the Internet by researchers (Döring,
2002). Net users have become information producers instead of consumers only while creating
their own personal homepages or blogs to disseminate and control individual information.
Net users establish a new self-image through the
Internet, which might be totally different from
their true ego (Dominick, 1999; O'Sullivan, 2000;
Papacharissi, 2002).
The motivations for using online photo albums account for most of the mentioned cases. However, in the case of Taiwan, the uses and gratifications approach is not sufficient for explaining online photo album users' staging behaviors.
First, the traditional uses and gratifications approach is for media usage, which is so different
from Internet users' behaviors today. As Song et
al. (2004) claim, an over-reliance on the set developed from television studies leads to a failure
to depict new gratifications.
Second, net users actively shape their presentation to interact and communicate with others adequately in everyday life in order to perform in this "performative society" (Kershaw, 1994); this staging or performing has not been elaborated. It is the special value that technology users often assign, but researchers ignore when studying technology only with the uses and gratifications approach.

As a result, this study suggests that studying net users' motivations must consider that users are not
only the audience of the uses and gratifications
approach, but also the diffused audience of the
spectacle/performance paradigm. This part will
be used to construct a questionnaire to identify
new factors.

Diffused Audiences
As Abercrombie and Longhurst (1998) argue, audience research has to take account of the changing
nature of audience and social processes, which
current research ignores. The common uses and
gratifications approach (and effects literature) is
categorized as a behavioral paradigm. According
to Hall (1980), there are a number of problems with
this paradigm, such as its lack of attention to power
relations, the textual nature of media message,
and understanding of social life. Halls critical
approach to the study of the media is categorized
as the incorporation/resistance paradigm (IRP),
as in how social structuring and social location
influence decoding of media texts. The key argument is the extent to which audiences resist or are
incorporated by media texts in ideological terms.
However, overemphasizing the coherence of the
response to different texts is problematic.
Abercrombie and Longhurst (1998) argue that
the spectacle/performance paradigm (SPP) is
much better in understanding the changing audience and conceptualizations of the audience and in
recognizing the audience's identity formation and
reformation in everyday life. They propose that
there are three different types of audiences: simple,
mass, and diffused. All of them co-exist.
The simple audience involves direct communication from performers to audience. The mass
audience reflects the more mediated forms of
communication. The diffused audience implies
that everyone becomes an audience all the time,
which entails people spending increasing amounts
of time in media consumption. Thus, the audience
interacts with the form of mediascapes, rather than media messages or texts per se. Being a member
of an audience is a part of everyday life under two
social processes. First, the social world is more
of a spectacle nowadays. Second, the individual
is more narcissistic. Furthermore, the nature of
audience is changing and not static. In terms of
skills, audience could be identified as a continuum,
from the consumer, fan, cultist, enthusiast, to the
petty producer, in ascending order. Those who try
to make money by means of their online photo
album are more like petty producers.
Abercrombie and Longhurst offer a cycle to
explain the interaction of media and the diffused
audience. They take football as an instance. Four processes form this cycle. First, media drenching: The audience increases football consumption in various media. Second, everyday
life: Media drenching facilitates interaction and
discussion, as well as emotions engaged. Third,
performance: The audience has more knowledge
about football, increases attachment to the favorite
team, and is identified as a football fan/follower.
Fourth, spectacle/narcissism: The audience desires increased visibility/knowledge as a basis for
performance and is also constructed by football,
and the display of logos, photos, and clothing. To
obtain more knowledge, the audience is drenched
in a massive amount of mediascapes again.
The SPP has proved adequate in empirical research. Longhurst, Bagnall, and Savage (2004)
use it to analyze the museum audience to connect
the ordinariness of museums as part of the audience processes of everyday life to wider themes
of spectacle and performance. Museum visiting
is proven to be a salient part of the identity of
the parent.
Online photo album users are not a typical
simple or mass audience. They are simultaneously information consumers and producers. They
continually get information from fused communication and a huge amount of mediascapes. They
could be also categorized as consumers, fans,
cultists, enthusiasts, and petty producers. Without considering the assumptions of the diffused audience and exploration under the SPP, online
photo album users are simply taken as the mass
audience. Their different behaviors are deemed
only as another gratification. However, the role of
information producer, the phenomenon of being
noted, and the constitutive role in everyday life are
left unexplained.

Main Thrust of the Chapter


Issues, Controversies, Problems
Research Questions and Hypotheses
Based on the review of privacy, major gratifications of the Internet, and diffused audiences
from the SPP, we find that privacy concerns and
privacy practices are situated, which cannot be
examined without considering the contexts. In this
study, the contexts are users' gratifications and
visual cues. Research questions and hypotheses
are listed below.
RQ1: What is the relationship between privacy
concerns and practices?
H1a: The more they disclose their information,
the less privacy concerns they hold.
H1b: The more they disclose their visual cues, the
less privacy concerns they hold.
H1c: The more they disclose their visual cues
(post more pictures), the more information
they disclose.
RQ2: What is the difference in privacy concerns
between online photo album haves and
have-nots?
H2: Those who have online photo albums are
concerned less about their privacy.
RQ3: What is the difference in privacy practices
between online photo album haves and
have-nots?
H3: Those who have online photo albums
disclose more information.

RQ4: What are online photo album users' gratifications?


RQ5: What is the relationship between gratifications and privacy concerns?
H4: Those whose gratifications are more like
diffused audiences are concerned less about
privacy.
RQ6: What is the relationship between gratifications and posting pictures?
H5: Those whose gratifications are more like
diffused audiences post more pictures on
the Internet.
RQ7: What is the relationship between gratifications and privacy practices?
H6: Those whose gratifications are more like
diffused audiences disclose more of their
information.

Data Collection
The survey sample is drawn from volunteers, who
were recruited from the biggest online photo album
Web site in Taiwan: Wretch (http://www.wretch.
cc). The advertisement for recruiting volunteers
was put on the homepage with a hyperlink to
a Web survey interface. A sample of 893 users
participated in this Web survey. The sample
consisted of 91.3% users (815) who have at least
one online photo album and 8.6% users who do
not have any (77). The average number of hours
on the Internet per week was reported to be 5.64
hours. The most frequently visited album types
are relatives (56.2%), gorgeous persons (48.0%),
celebrities (19.4%), and others.

Instrument Construction
This study draws on motives identified in previous studies on uses and gratifications from mass
media and the Internet, adds the virtual community factor found by Song and his colleagues,
and develops new items based upon Abercrombie
and Longhurst's diffused audience cycle. The
diffused audience questions are developed by conducting in-depth interviews with 10 users.
Different levels of users are chosen in terms of
the audience skills. The wordings were slightly
changed to fit the online photo album context.
After a pilot study, this study uses a list of 49
gratification items with Likert scales from 1 to
5. This study revises the privacy concerns questionnaire adopted from previous research (Smith,
Milberg, & Burke, 1996).
This study also would like to see online photo album users' privacy practices, that is, how often they disclose their information and pictures. There are two questions about pictures: "clear picture of you" and "clear picture of friends and relatives." The scopes of personal information in
this study are revised from the most frequently
asked information lists done by GVU (1998) and
sensitive information which are emphasized by
privacy advocates, including demographic information, contact information (address, telephone
numbers), online contact information (email
address), other family members' information,
and details of everyday life, such as where to go,
where to diet, and so forth.

Data Analysis
The results are listed following the order of the research questions and hypotheses. All analyses were done using the SPSS 14.0 statistical program. A principal component solution and varimax rotation were adopted to find gratification groupings. This chapter selects items with a major loading of 0.5 or higher and a secondary loading no greater than 0.40. Each factor is extracted with an eigenvalue greater than 1.0 and a minimum reliability of 0.60. All hypotheses were
tested by Pearson product-moment correlations.
A regression analysis was employed to evaluate the relationship between privacy concerns
(dependent variable) and Internet gratifications
(independent variables).
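For readers who wish to reproduce this kind of analysis outside SPSS, the following Python sketch walks through the main steps; the data file and column names are hypothetical placeholders, and the thresholds mirror the criteria just described:

import numpy as np
import pandas as pd
from scipy import stats

df = pd.read_csv("wretch_survey.csv")                    # hypothetical survey export
items = df[[c for c in df.columns if c.startswith("grat_")]]   # the 49 Likert items

# Principal component solution on the correlation matrix, Kaiser criterion
corr = np.corrcoef(items.to_numpy(), rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
n_factors = int((eigvals > 1.0).sum())                   # eigenvalues greater than 1.0
loadings = eigvecs[:, :n_factors] * np.sqrt(eigvals[:n_factors])

def varimax(L, max_iter=100, tol=1e-6):
    """Orthogonal varimax rotation of a loading matrix."""
    p, k = L.shape
    R = np.eye(k)
    d = 0.0
    for _ in range(max_iter):
        LR = L @ R
        u, s, vt = np.linalg.svd(L.T @ (LR ** 3 - LR @ np.diag((LR ** 2).sum(axis=0)) / p))
        R = u @ vt
        if s.sum() < d * (1 + tol):
            break
        d = s.sum()
    return L @ R

rotated = varimax(loadings)
# Item selection (primary loading >= .50, cross-loadings below .40) would follow here.

def cronbach_alpha(block: pd.DataFrame) -> float:
    """Reliability of a set of items (used for the 0.60 minimum criterion)."""
    k = block.shape[1]
    return k / (k - 1) * (1 - block.var(ddof=1).sum() / block.sum(axis=1).var(ddof=1))

# Pearson correlation (e.g., H1b: posting clear pictures vs. general privacy concerns)
r, p = stats.pearsonr(df["clear_pictures_of_you"], df["general_concern"])
print(f"pictures vs. concerns: r = {r:.3f}, p = {p:.4f}")

# Independent-samples t test (e.g., H2: album owners vs. non-owners)
t, p = stats.ttest_ind(df.loc[df.has_album == 1, "general_concern"],
                       df.loc[df.has_album == 0, "general_concern"], equal_var=False)
print(f"owners vs. non-owners: t = {t:.3f}, p = {p:.4f}")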

Results
RQ1: What is the relationship between privacy
concerns and practices?
H1a: The more they disclose their information,
the less privacy concerns they hold.
H1b: The more they disclose their visual cues, the
less privacy concerns they hold.
H1c: The more they disclose their visual cues
(post more pictures), the more information
they disclose.
It seems that the respondents are not very concerned about privacy, as the means are all below 2 (see Table 1). However, the respondents' concern about their privacy varies considerably, as the standard deviations are all higher than .600,
especially questions 1, 5, 7, and 12, in which
their standard deviations are over .800. In order
to know the details, all 12 questions, combined as general privacy concerns, are further analyzed
with gratifications and other variables.
The correlation between frequency of information disclosure and general privacy concerns shows
some critical points. First, the privacy practices
are not parallel with their privacy concerns. The
coefficients are quite weak and contradictory (see
Table 2). Those who post more clear pictures of
themselves (-.130**) or clear pictures of their
friends or relatives (-.124**) hold less privacy
concerns. Second, users who usually disclose
visual cues on the Internet are concerned less
about privacy. However, despite other family members' information, contact information, and demographic information being somewhat more
sensitive than everyday life and online contact
information, respondents with higher privacy
concerns unexpectedly disclose more of the
sensitive types of information. In order to make
clear the contradiction, it is necessary to see the
differences between users who have online photo
albums and those who do not.



Table 1. Descriptive statistics of privacy concerns (Mean, SD)

1. It usually bothers me when a Web site asks me for personal information. (1.85, .887)
2. When a Web site asks me for personal information, I sometimes think twice before providing it. (1.77, .789)
3. Web sites should take more steps to make sure that unauthorized people cannot access personal information in their computers. (1.48, .687)
4. When people give personal information to a Web site for some reason, the Web site should never use the information for any other reason. (1.34, .637)
5. Web sites should have better procedures to correct errors in personal information. (1.69, .815)
6. Computer databases that contain personal information should be protected from unauthorized access, no matter how much it costs. (1.35, .638)
7. Some Web sites ask me to register or submit personal information before using them. It bothers me to give my personal information to so many websites. (1.82, .910)
8. Web sites should never sell the personal information in their databases to other companies. (1.26, .605)
9. Web sites should never share personal information with other Web sites unless it has been authorized by the individual who provided the information. (1.31, .627)
10. I am concerned that Web sites are collecting too much personal information about me. (1.60, .799)
11. The best way to protect personal privacy on the Internet would be through strong laws. (1.50, .781)
12. The best way to protect personal privacy on the Internet would be through corporate policies, which the corporations develop themselves. (1.63, .908)

RQ2: What is the difference in privacy concerns


between online photo album haves and
have-nots?
H2: Those who have online photo albums are
concerned less about their privacy.
RQ3: What is the difference in privacy practices
between online photo album haves and
have-nots?
Table 2. Correlations of privacy practices and general privacy concerns

Clear pictures of you: -.130**
Clear pictures of your friends or relatives: -.124**
Demographic information: .126**
Contact information: .183***
Online contact information: .080*
Other family members' information: .199***
Everyday life: .028

*** Correlation is significant at the 0.001 level (2-tailed). ** Correlation is significant at the 0.01 level (2-tailed). * Correlation is significant at the 0.05 level (2-tailed).



H3: Those who have online photo albums disclose
more information.
In order to answer Hypotheses 2 and 3, the
t test is employed to see if users with online
photo albums have less privacy concerns and
more privacy practices (see Table 3). Those who
have online photo albums perceive less privacy
concerns than the have-nots. This parallels
Hypothesis 2. As for privacy practices, those
who have online photo albums do disclose more
clear pictures of themselves and clear pictures
of their friends or relatives than the have-nots,
which is not surprising. What is unexpected is that those who have no online photo albums disclose more sensitive information, such as demographic information, contact information, and other family members' information, than those who have
online photo albums.
The findings show that privacy concerns do not
always reflect upon users' practices. Taking the
contexts into consideration, online photo album
users post many pictures online, but they might


Table 3. Effect of having an online photo album or not on privacy concerns and information disclosure

General privacy concerns: YES (n = 702) M = 1.53, SD = .463; NO (n = 63) M = 1.84, SD = .873; t = -4.544 (a); mean difference = -.31
Clear pictures of you (PCU): YES (n = 746) M = 3.72, SD = 1.253; NO (n = 11) M = 2.00, SD = 1.342; t = 4.526 (a); mean difference = 1.724
Clear pictures of your friends or relatives (PFR): YES (n = 744) M = 3.63, SD = 1.256; NO (n = 11) M = 2.27, SD = 1.618; t = 3.549 (a); mean difference = 1.360
Demographic information (DEM): YES (n = 734) M = 2.21, SD = 1.159; NO (n = 10) M = 3.00, SD = 1.886; t = -2.109 (c); mean difference = -.786
Contact information (CON): YES (n = 741) M = 1.49, SD = .912; NO (n = 10) M = 2.80, SD = 1.687; t = -4.445 (a); mean difference = -1.309
Online contact information (OCO): YES (n = 741) M = 2.66, SD = 1.348; NO (n = 10) M = 2.70, SD = 1.767; t = -.093; mean difference = -.040
Other family members' information (FAM): YES (n = 743) M = 1.49, SD = .832; NO (n = 10) M = 2.70, SD = 1.767; t = -4.475 (a); mean difference = -1.210
Everyday life (LIF): YES (n = 743) M = 3.53, SD = 1.124; NO (n = 10) M = 3.20, SD = 1.687; t = .916; mean difference = .330

(a) Significant at the 0.001 level (2-tailed). (b) Significant at the 0.01 level (2-tailed). (c) Significant at the 0.05 level (2-tailed).

hold back the very sensitive information to prevent


cyber stalking and other consequences. That is
why insensitive information, such as online contact information and everyday life, does not show
any significant differences between the haves and
have-nots. Apparently, there is no harm in others knowing the visual cues and insensitive information. The results show that the first context, the technology platform, proves that privacy practices and
concerns are situated.
RQ4: What are online photo album users' gratifications?
This research extracts 10 factors with eigenvalues above 1.0 accounting for 67.39% of
total variance from the original set of Internet gratifications. The first gratification, named information seeking, contains seven items (α =
0.89), including learning about local community
events, getting useful information, finding bargains, and getting up to date with new technology (see Table 4). Factor two is characterized as
media drenching (α = 0.88). This factor indicates that respondents increase their usage of online
photo albums and get updates from some of their
favorite albums.
The third gratification, named diversion (α = 0.85), results from the pleasurable experience
of content. The fourth and fifth gratifications,
named respectively performance and narcissism, are unprecedented in prior research. The factor performance (α = 0.83) refers to users' media drenching facilitating discussion of particular persons or things. Users then perform
their identity as fans/followers and also show their attachment and knowledge. The factor narcissism (α = 0.82) points out that users desire increased visibility/knowledge as a basis for performance
and show their special identities, not only on the
Internet, but also outside the Internet.
The sixth and seventh gratifications are relationship maintenance (α = 0.90) and aesthetic experience (α = 0.88). Those findings are comparable
to those of Song et al. Unlike virtual community,
the factor relationship maintenance focuses on
existing acquaintances, not new friends on the
Internet. Aesthetic experience fits the needs of
aesthetic pleasure. The eighth gratification is
virtual community (α = 0.82), which is similar to
the finding of Song et al. Users try to establish a
new social life online.
The ninth and tenth gratifications are named function (α = 0.60) and reference (α = 0.82).
These two are unprecedented. Saving files and
finding albums that are easy to navigate are goal-oriented examples. Users take album owners' released information as a reference and plan to do
the same thing. They do not pursue pleasurable
experiences only, but put what they learn into
practice in the real world.
Which gratifications are more like ones that
diffused audiences may have? According to
Abercrombie and Longhurst (1998), narcissism,
performance and media drenching undoubtedly
could be categorized as diffused audiences' gratifications. Virtual community, by its name alone, seems to have no connection with a diffused
audience. If we look into the meaning, we find
out that the purpose of virtual community is in
finding companionship and meeting new friends
and the way of finding companionship is through
establishing special identities to appeal to new
friends (Papacharissi, 2004; Song et al., 2004).
This is quite like the argument of the SPP, whereby
audiences form their identities to perform and to
be narcissistic.
Respondents show 10 gratifications while
using online photo albums in this study. How do
the gratifications influence users' privacy con-

Table 4. Online photo album gratification factor loadings (item loading; factor eigenvalue, % variance, α)

Factor 1: Information seeking (IS). Eigenvalue 5.31; variance 10.83; α = 0.89
Learn about local community events (.547)
Get fashion information (.738)
Get travel information (.851)
Get gourmet food information (.860)
Get useful information about products or services (.837)
Find bargains on product and services (.513)
Get up to date with new technology (.587)

Factor 2: Media drenching (MD). Eigenvalue 4.03; variance 8.232; α = 0.88
Spend lots of time checking online photo albums without awareness (.730)
Checking albums is a part of my life (.757)
Have my favorite albums (.787)
Check if those albums are updated regularly (.794)
Expect new photos from people I like (.705)

Factor 3: Diversion (DV). Eigenvalue 3.57; variance 7.292; α = 0.85
Have fun (.575)
Feel excited (.520)
Feel entertained (.754)
Feel relaxed (.738)
Kill time (.655)

Factor 4: Performance (PF). Eigenvalue 3.52; variance 7.190; α = 0.83
Download favorite persons' pictures (.568)
Discuss particular persons from albums with friends (.576)
Discuss particular persons from albums in BBS (.711)
Find relative information of particular persons from albums (.644)
Let others know I am a fan of particular persons, from albums (.686)
Enjoy browsing online photo album (.608)

Factor 5: Narcissism (NA). Eigenvalue 3.52; variance 7.179; α = 0.82
Develop a romantic relationship (.574)
Get people to think I am cool (.641)
Improve my standing in the world (.700)
Feel like I belong to a group (.507)
Find ways to make more money (.545)
Get noted (.610)

Factor 6: Relationship maintenance (RM). Eigenvalue 3.47; variance 7.088; α = 0.90
Get in touch with people I know (.744)
Keep posted about relatives and friends (.879)
Want to see relatives and friends' recent pictures (.885)
Want to see relatives and friends' social life (.798)

Factor 7: Aesthetic experience (AE). Eigenvalue 3.21; variance 6.559; α = .88
Find cool new albums (.645)
See albums with pleasing designs (.659)
Find some gorgeous persons' pictures (.636)
Find attractive graphics (.660)
See albums with pleasing color schemes (.568)

Factor 8: Virtual community (VC). Eigenvalue 3.01; variance 6.153; α = .82
Find more interesting people than in real life (.637)
Meet someone in person who I met on Internet (.535)
Find companionship (.755)
Meet new friends (.674)

Factor 9: Function (FN). Eigenvalue 1.75; variance 3.565; α = .59
Save picture files (.777)
Find albums that are easy to navigate (.519)

Factor 10: Reference (RE). Eigenvalue 1.62; variance 3.301; α = .82
Check where particular persons from albums have been (.600)
Go to the same place where particular persons from albums went (.719)

Total variance explained = 67.39%



How do the gratifications influence users' privacy concerns and practices? Do the diffused audiences' gratifications make any difference compared with other gratifications? Do respondents with diffused audiences' gratifications care less about privacy and disclose more information, or anything else? This is articulated in the next section.
RQ5: What is the relationship between gratifications and privacy concerns?
H4: Those whose gratifications are more like diffused audiences' are concerned less about privacy.
A correlation coefficient may be difficult to assess just by examining the coefficient itself. In this research, the correlations between the factors and the privacy-concern items, including general privacy concerns, are considered low (see Table 5). Nevertheless, the researcher should consider the situation when interpreting correlation coefficient sizes. As Jaeger (1990, cited in Reinard, 2006) points out, to judge whether a correlation coefficient is large or small, you have to know what is typical. Thus, the study pays more attention to a comparison of correlation coefficients between factors and privacy concerns in order to see how different gratifications influence respondents' privacy concerns.
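The coefficients compared here and reported in Table 5 are bivariate Pearson correlations with two-tailed significance tests. A minimal sketch of that computation is given below; the variable names and data are hypothetical placeholders, not the study's data set.

import numpy as np
from scipy import stats

# Hypothetical data: factor scores for one gratification and scores on one
# privacy-concern item, standing in for the study's survey variables.
rng = np.random.default_rng(1)
diversion = rng.normal(size=600)
concern_q5 = 0.2 * diversion + rng.normal(size=600)

r, p = stats.pearsonr(diversion, concern_q5)

# Flag the coefficient the way Table 5 does (a/b/c superscripts).
if p < 0.001:
    flag = "a"
elif p < 0.01:
    flag = "b"
elif p < 0.05:
    flag = "c"
else:
    flag = ""
print(f"r = {r:.3f}{flag} (p = {p:.4f}, two-tailed)")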
The descending order of correlation coefficients between gratifications and general privacy concerns is diversion, relationship maintenance, aesthetic experience, reference, media drenching, function, information seeking, virtual community, performance, and narcissism. All are significant, but if we look at each question more deeply, users with different gratifications have diverse aspects of privacy concerns.
There are six gratifications which do not correlate with Q1 significantly. The coefficients of media drenching, diversion, relationship maintenance, and reference are significant. Among the 12 questions, Q1 and Q7 (Q7 will be discussed later) seem not to irritate users' privacy concerns as much as the others. Why do some respondents not feel bothered when a website asks them for personal information? The context should be considered. Respondents with information seeking, performance, narcissism, aesthetic experience, virtual community, and function gratifications have no choice but to register and provide information in order to use the sites' services. Therefore, these six correlation coefficients are insignificant.
As for Q2, respondents with performance, narcissism, and virtual community gratifications seem not to think twice before providing personal information "when a Web site asks me for it." Unlike with Q2, virtual community gratification has a significant correlation with Q3. Why do respondents with virtual community gratification think that Web sites should take more steps to make sure that unauthorized people cannot access personal information in their computers? Net users usually have several identities across several communities. Without preventing unauthorized access, users would be more easily recognized by cross-referencing. Performance, narcissism, and function have an insignificant correlation with Q3. Users with these gratifications care less about unauthorized people accessing information.
Information seeking, performance, narcissism, virtual community, and function do not correlate with Q4. Those with these gratifications do not worry about their personal information being used for any other reason by the Web sites to which they give information. In addition, all gratifications correlate with Q5, which means the respondents all care about data error. As for Q6, only respondents with performance, narcissism, and virtual community care less about the database being protected from unauthorized access, no matter how much it costs (no significant correlation). Comparing Q3 and Q6, it seems that users with virtual community only care about unauthorized people accessing their information and do not care about protecting databases in a broad sense.
The correlation of Question 7 and the gratifications is worth deliberating. Only four gratifications, diversion, relationship maintenance, aesthetic experience, and reference, have a significant
correlation with Q7. Taking research situations into consideration, when users use online photo albums for diversion, aesthetic experience, and reference, they usually do the activities alone and do not prepare to have any interaction with others online. That is why they feel particularly bothered when Web sites ask them to register or submit personal information before use. As for relationship maintenance, this gratification could be seen as an in-group need. Users tend to have interaction with their acquaintances online. Following Taylor, Franke, and Maynard's (2000) argument, people with more in-group privacy are more sensitive to differentiating between people inside and outside their private world.
Q8, data transferring, seems not to be a big deal for respondents with narcissism, performance, and virtual community gratifications, and this makes sense under the SPP paradigm. Those who are more interested in being identified and noted are not concerned about data transferring, which ironically might satisfy their needs to catch more people's eyes. Interestingly, Q9 does not correlate significantly with information seeking and function, which usually have higher correlation coefficients. The reason might be that users with these two gratifications, to some extent, have to count on the Web sites' services to fulfill their tasks. By the same token, information seeking and function do not correlate significantly with Q1, Q4, Q7, and Q9.
Although all gratifications significantly correlate with Q10, users with relationship maintenance, function, reference, and diversion worry more about information being collected. On the other hand, respondents with narcissism, virtual community, and performance gratifications are willing to expose themselves on the Internet. Correspondingly, they are not concerned that Web sites are collecting too much personal information about them.
As for Q11 (protecting online privacy through strong laws), performance and narcissism do not have a significant correlation with it, but all 10 gratifications significantly correlate with Q12 (through corporate policies). In general, ranked in descending order of privacy concern, respondents use online photo albums with diversion, relationship maintenance, aesthetic experience, reference, media drenching, function, information seeking, virtual community, performance, and narcissism gratifications.
Table 5. Correlation between factors and privacy concerns

        IS      MD      DV      PF      NA      RM      AE      VC      FN      RE
Q1    -.001    .082    .127    .031    .003    .096    .061   -.030    .021    .080c
Q2     .116b   .127a   .137a   .066    .049    .142a   .136a   .061    .071c   .123a
Q3     .093b   .131a   .205a   .060    .049    .170a   .173a   .097b   .063    .154a
Q4     .050    .137    .204    .037    .009    .203    .138    .036    .063    .102b
Q5     .139a   .218a   .223a   .177a   .124b   .164a   .229a   .172a   .201a   .200a
Q6     .089    .141    .169    .050    .008    .179    .169    .060    .088    .128a
Q7     .039    .054    .160a   .040    .037    .089c   .103b  -.013    .057    .084c
Q8     .093b   .153a   .198a   .056    .016    .214a   .166a   .064    .089c   .126a
Q9     .019    .115    .164    .040   -.027    .186    .144    .016    .064    .079c
Q10    .119b   .147a   .153a   .093b   .108b   .177a   .145a   .108b   .167c   .164a
Q11    .132a   .168a   .158a   .029    .053    .170a   .168a   .071c   .186a   .165a
Q12    .169    .163    .177    .101    .139    .147    .163    .130    .180    .157a
Gen.   .127b   .195a   .247a   .107b   .083c   .236a   .219b   .109b   .156a   .206a

a Correlation is significant at the 0.001 level (2-tailed). b Correlation is significant at the 0.01 level (2-tailed). c Correlation is significant at the 0.05 level (2-tailed).



Hypothesis H4 is sustained. Those whose gratifications are more like diffused audiences' (performance, narcissism, and virtual community) are concerned less about privacy. However, media drenching seems not to be a gratification which can be categorized with performance, narcissism, and virtual community as a diffused audiences' gratification. It only has an insignificant correlation with Q7.
How much do those surveyed gratifications influence privacy concerns? The regression analysis with the enter method (see Table 6) illustrates that information seeking, performance, narcissism, and virtual community are all negative and insignificant predictors of privacy concerns, which corresponds with the SPP argument of this study. Media drenching, aesthetic experience, and function are positive, but not significant, predictors. Diversion, relationship maintenance, and reference are all positive and significant independent variables. The 10 gratifications account for 12.0% of the variance in privacy concerns.
The regression analysis with the stepwise method demonstrates more details of these three significant gratifications (see Table 7). The strongest variable is relationship maintenance, followed by diversion and reference. The three gratifications account for 10.4% of the variance in privacy concerns. This shows that uses and gratifications can be an important predictor of privacy concerns, which is unprecedented.

RQ6: What is the relationship between gratifications and posting pictures?
H5: Those whose gratifications are more like diffused audiences' post more pictures on the Internet.
RQ7: What is the relationship between gratifications and privacy practices?
H6: Those whose gratifications are more like diffused audiences' disclose more of their information.

Table 6. Regression analysis with the enter method

Variable                    Beta      t
(Constant)                           10.707
Information seeking        -.085     -1.621
Media drenching             .064      1.247
Performance                -.072     -1.419
Narcissism                 -.050      -.934
Diversion                   .134      3.097b
Aesthetic experience        .094      1.701
Virtual community          -.032      -.581
Relationship maintenance    .192      3.464
Function                    .029       .661
Reference                   .119      2.535
R2 = .120; F = 8.886a

Table 7. Regression analysis with the stepwise method

Model  Variable                    Beta      t         R2     R2 change   F
1      (Constant)                           22.850
       Relationship maintenance    .256      6.805a    .066   .066        46.303
2      (Constant)                           14.341
       Relationship maintenance    .191      4.799a
       Diversion                   .180      4.527a    .094   .028        34.086
3      (Constant)                           12.263
       Relationship maintenance    .151      3.584a
       Diversion                   .159      3.952a
       Reference                   .114      2.760a    .104   .010        25.492a
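The analyses reported in Tables 6 and 7 are ordinary least squares regressions of privacy concern on the gratification factors, run first with all predictors entered together and then stepwise. The sketch below reproduces only the "enter"-style model using statsmodels on placeholder data; the variable names are hypothetical and the stepwise selection of the original analysis is not replicated.

import numpy as np
import pandas as pd
import statsmodels.api as sm

# Placeholder factor scores and a privacy-concern index standing in for the survey data.
rng = np.random.default_rng(2)
n = 600
data = pd.DataFrame({
    "relationship_maintenance": rng.normal(size=n),
    "diversion": rng.normal(size=n),
    "reference": rng.normal(size=n),
    "narcissism": rng.normal(size=n),
})
data["privacy_concern"] = (
    0.25 * data["relationship_maintenance"]
    + 0.18 * data["diversion"]
    + 0.11 * data["reference"]
    + rng.normal(size=n)
)

# "Enter" method: all predictors included in a single OLS model.
X = sm.add_constant(data.drop(columns="privacy_concern"))
model = sm.OLS(data["privacy_concern"], X).fit()

print(model.summary())                     # coefficients, t values, R-squared, F statistic
print("R^2 =", round(model.rsquared, 3))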


In order to inspect RQ6, RQ7, H5, and H6, this research adopts Pearson correlation analysis (see Table 8). For those whose gratifications are more like diffused audiences', including performance, narcissism, and virtual community, there is no significant correlation between these three gratifications and posting clear pictures of themselves on the Internet. For those whose gratifications are media drenching, diversion, relationship maintenance, and reference, there is a significant negative correlation, which means they tend not to post clear pictures of themselves. Those four gratifications are so private that users will usually lurk and not be known. There is also no significant correlation between the three diffused audiences' gratifications (performance, narcissism, and virtual community) and posting clear pictures of friends and relatives on the Internet. In addition, there is a significant negative correlation between media drenching, relationship maintenance, and reference and posting clear pictures of friends and relatives on the Internet.
This study cannot directly prove that those whose gratifications are more like diffused audiences' will post more pictures on the Internet. However, it does find that those whose gratifications are more private disclose fewer pictures of themselves, friends, and relatives. Thus, H5 is rejected and should be revised: those whose gratifications are more private post fewer pictures on the Internet.
Privacy practices, interestingly enough, have a significant correlation with diffused audience gratifications, but, conflicting with Hypothesis 6, those who have performance, narcissism, and virtual community gratifications do not disclose more. On the contrary, narcissism gratification has a significant negative correlation with disclosing contact information, online contact information, other family members' information, and everyday life. Performance gratification has a significant negative correlation with online contact information, other family members' information, and everyday life. Virtual community gratification has a significant negative correlation with disclosing contact information, online contact information, and other family members' information.
If we look more into the contexts, it makes sense again. There is a difference between performance and narcissism, although they are grouped as diffused audience gratifications. Those who have performance gratification are cultists, enthusiasts, or petty producers for some particular themes or persons. They are willing to disclose information to others who are not as professional as they are. They even enjoy being a pro in the field. People who have narcissism gratification are not crazy about a particular theme or person. Their performance subjects are themselves. Thus, they already post a lot of information and hesitate to give away contact information, online contact information, and other family members' information, which might ruin the identities they have created online.
For performance, they see online contact information as more sensitive than contact information. This is quite different from previous research. Although people with performance gratification are willing to be famous for their expertise, do they really feel no harm at being reached in the real world? Why do they hold back their online contact information instead of contact information? The reason might be that homepages, blogs, and other self-disclosure tools on the Internet are categorized as online contact information, which might reveal more about themselves. Additionally, users are known to some extent as having either performance or narcissism gratifications. Thus, they withhold other family members' information and everyday life.
As for virtual community, those who want to meet people online hold back their real-life information. Therefore, they do not want to disclose their contact information, online contact information, and other family members' information, which could be traced back to their real identity. As long as they remain anonymous in their real life, it would be fine for them to give away information about their everyday life.
This study cannot directly prove that those whose gratifications are more like diffused audiences' disclose more on the Internet. Thus, H6 is rejected, but there is more worth examining. In contrast, this study finds that users with performance, narcissism, and virtual community gratifications have some particular considerations because they disclose many pictures. As for the correlation of other gratifications and privacy practices, information seeking, diversion, aesthetic experience, and function have a significant negative correlation with online contact information. Nevertheless, there is no significant correlation of any gratification with contact information.
Why does the less sensitive type of information have a significant correlation with the more private gratifications? The reason might be that those with more private gratifications take securing sensitive information online for granted and have never thought about giving it away, which makes the statistical results irrelevant. However, they do think twice about the consequences of giving away the less sensitive information. Take relationship maintenance as an example. Demography and everyday life have a significant negative correlation. Why not other information types? If we consider the in-group privacy characteristics of relationship maintenance, the answer is obvious. In-group members do not want to share information about the group with others.
Media drenching gratification only has a significant negative correlation with everyday life. We do not have a clear idea why media drenching has no correlation with the others. It is logical to presume that media drenching users love to show their expertise about some celebrities or things, but they may not want it to be known how they did it. Everyday life is somehow sensitive for media drenching users. Indeed, the simple statistics cannot tell more about the relationship between them, but they reveal that contexts do play very important roles in users' privacy practices.

Solutions and Recommendations

Each of the significant relationships is provocative and invites further thinking and research on the associations of privacy and contexts which the adversarial paradigm neglects. Although users who have online photo albums are, in general, concerned less about privacy than the have-nots, it does not always mean the haves will disclose more information than the have-nots. This might come about because of two important contexts.
The first context is the common operation of online photo albums. Online photo album users post many pictures online, and they might hold back the most sensitive information to prevent cyber stalking and other consequences.
The second context is patrons' gratifications. Based upon the interpretation, we have seen how users' gratifications in using a technology platform impact their privacy concerns

Table 8. Correlation between factors and privacy practices

        IS       MD       DV       PF       NA       RM       AE       VC       FN       RE
PCU    .024      —       -.198      —        —        —        —        —        —        —
PFR    .050      —       -.153a     —        —        —        —        —        —        —
DEM    .033      —       -.051      —        —        —        —        —        —        —
CON   -.045     .058    -.005    -.024    -.112     .072     .004    -.078    -.038     .001
OCO   -.086c   -.015    -.143a   -.103b   -.151a   -.020    -.104b   -.179a   -.075c   -.024
FAM   -.079c   -.020     .011    -.083c   -.156a    .057     .020    -.101b   -.033    -.006
LIF   -.019    -.164    -.088    -.075    -.077    -.178    -.046    -.062    -.028    -.163a

and practices. H5 is rejected; instead, the study finds that those whose gratifications are more private disclose fewer pictures of themselves, friends, and relatives. H6 is also rejected. Those who already post many pictures, or are more diffused audience-oriented, might think about possible consequences and hold back information.
As for the relationships between privacy concerns and gratifications, the descending order of correlation coefficients between gratifications and privacy concerns is diversion, relationship maintenance, aesthetic experience, reference, media drenching, function, information seeking, virtual community, performance, and narcissism. From the standpoint of uses and gratifications and the SPP, this study's findings highlight the idea that the gratifications can be categorized into three groups.
The first group includes performance, narcissism, and virtual community. Respondents who use online photo albums with these three gratifications like to expose themselves to the Internet world in order to meet people and even become noted. The second group consists of information seeking, function, and media drenching. Respondents with these three gratifications are goal-oriented. They adopt online photo albums for specific purposes, such as seeking information, storing files, and checking their favorite albums. They do not like to meet people or become famous. The third group comprises four gratifications: diversion, relationship maintenance, aesthetic experience, and reference. They are not as goal-oriented as respondents with the second group's gratifications. Their purposes are more like seeking personal pleasure, which is usually kept secret or shared with some in-group persons only.
Media drenching, originally categorized as a diffused audiences' gratification, is an interesting one. Users with this gratification do not like to put clear pictures of themselves, relatives, and friends on the Internet. However, it only has a significant negative correlation with everyday life (privacy practices) and a non-significant correlation with Q7 (privacy concerns). By looking at the questions composing the factor, the influence of context is revealed. Media drenching users enjoy being a pro in their favorite field and make efforts to maintain their expertise, but they do not want to reveal their efforts and resources. Therefore, media drenching is instead categorized into the second group, which often does things secretly.
The diverse aspects of privacy concerns caused by different gratifications do show up in this study's findings. For example, information seeking does not have a significant correlation with Q1, Q4, Q7, and Q9. Those questions relate to collection and unauthorized secondary use (Smith et al., 1996). This study finds that users with information seeking are not more concerned about information being used by other Web sites, but are concerned about their information being used by unauthorized individuals. The message here is that users who want to use Web site services to seek information must sacrifice their privacy in some sense, but that does not mean access by unauthorized individuals is tolerable.
Function does not have a significant correlation with Q1, Q3, Q4, Q7, and Q9 either. Those questions relate to the improper use of information and collection, which Hsu's (2006) study defines. As long as Web sites protect data well, users with the function gratification do not worry about collection and secondary use. They need safe spaces for file storage more than they need privacy.
The more diffused audience gratifications, including performance, narcissism, and virtual community, only have a significant correlation with Q5, Q10, and Q12. This reveals that users with these three gratifications only care about data error, collecting too much personal information about them, and protecting privacy through corporate policies. They do not care about Web sites' registration collection, secondary uses, or unauthorized access (except virtual community). Users with virtual community have two more aspects of privacy concerns: unauthorized access and protecting privacy through strong laws. Accordingly, it is quite reasonable that users manage purely online relationships with special caution against being cheated or having their real-world identity divulged.
This in turn begs the question: which gratifications might lead users to care more about their privacy? Looking at both the correlation and regression analyses, relationship maintenance, diversion, and reference gratifications predict privacy concerns, accounting for 10.4% of the variance. These three gratifications are gained from using online photo albums secretly or sharing only with some in-group persons. Thus, users care about collection, data error, unauthorized access, and secondary uses, and hope that privacy can be protected by both strong laws and corporate policy. The other gratifications are not significant predictors at all. Moreover, relationship maintenance raises an interesting challenge for understanding international/intercultural differences in privacy (Taylor et al., 2000). Does the in-group privacy phenomenon only happen in Taiwan or Asian countries? Could it be applied to other countries and cultures?
Through this research, the author tries to show that current online privacy theories, frameworks, and models are not sufficiently thoughtful. The situational paradigm could be a solution for online privacy research. Speaking of the situational paradigm, two common questions are most often brought up. First, when it comes to contexts, there are no concrete answers: it is not possible to depict privacy, because it depends on the contexts. Second, how can we account for every kind of context? The situational paradigm does not mean contexts or situations are omnipotent. We need to keep in mind that users' behavior patterns and environmental risks are not static, but dynamic, especially in Internet settings. Thus, it would be unguarded and arbitrary to conclude what types of persons care more about their privacy and disclose less information. The findings also give a positive answer to the main question of this book: there are no absolute truths in online consumer protection, but instead theories of human relativism.
This study sheds new light on how users' gratifications influence their privacy concerns and practices, which is unprecedented. Internet users' uses and gratifications are not static either. Testing with an a priori theoretical framework, the SPP, shows that current Internet gratifications studies are not sufficient to account for album users' behaviors.
This study holds that future online privacy research should not take the Internet as a whole, but rather differentiate technology platforms first. Moreover, when studying a new technology platform, researchers have to find the social contexts which are essential to users' privacy concerns and practices, and accordingly study what makes users disclose or hold back their information.
This study also gives hints for policy-making and for the information gathering, storage, and retrieval practices of Web sites. At the macro level, policy makers have to recognize the differences between technology platforms and users' needs. As this book's theme suggests, theories of human relativism matter for online consumer protection. The minimum protection of personal data, including notice/awareness, choice/consent, access/participation, integrity/security, and enforcement/redress, should be implemented by legislation.
After fulfilling the basic fair information principles, the next step takes contexts into consideration to compose more comprehensive principles. Self-regulation is a more appropriate mechanism for focusing on the details of compliance, standardization, and consequences based upon context differences. Self-regulation adjusts quickly to changing technology and provides prompt and detailed guidelines for data users, with adequate levels of standardization and non-compliance penalties. Web sites could follow users' purposes to tailor privacy statements and provide useful services, which can obtain valuable information useful for marketing (Hsu, 2006).

At the micro level, this study introduces Moor's (1997) zones of privacy as a solution for possible privacy infringements caused by reckless net users. A zone of privacy is a set of privacy situations in which people become aware of aspects of their personal information that may be damaged if made public. In the Internet setting, with different zones of privacy, individuals can judge what type and how much information to keep private and which to make public, according to their uses and gratifications and other needs.
Following this logic, a combination of privacy and technology education is suggested to assure privacy protection for net users. Privacy education should be taught not only in schools, but also in the community and society by means of media and reports. With basic know-how of online privacy, net users need to learn the possible consequences and take responsibility for their information disclosure.

Future Trends

In previous studies, Web site categories, regulation, culture/nation, demographics, physical environment, and technology contexts have all been examined. There are many contexts left uncovered. It is crucial to find more contexts which are able to predict net users' behaviors on the Internet.
It additionally seems that the purposes of using varied forms of computer-mediated communication (CMC) influence net users' privacy concerns and practices. It is therefore also essential to find their purposes in using CMC. Take this research as an example: the uses and gratifications approach and the SPP provide a glimpse of how users' individual needs influence their privacy concerns and practices. Users who have more private gratifications act, and are concerned, quite differently from those who have more spectacular and performance-oriented gratifications.
Researchers should not only discover the predictive power of contexts, but also further examine their interrelationships. The next step is to use structural equation modeling (SEM), such as LISREL, to confirm these relationships.

Conclusion

This study explores the gratifications of online photo album users in Taiwan and finds a distinctive staging phenomenon with media gratifications and an a priori theoretical framework, the spectacle/performance paradigm (SPP). Media drenching, performance, function, and reference are new gratifications, which no prior research has found. Performance, narcissism, and virtual community gratifications are consistent with the argument of the diffused audience on the Internet.
This research then examines online privacy concerns and practices with two contexts: technology platforms (online photo album Web sites) and users' gratifications (the staging phenomenon) under the situational paradigm. It finds that users' gratifications influence their privacy concerns and practices accordingly. Users with more diffused audience gratifications are concerned less about privacy, but do not necessarily disclose more of their information. They judge the contexts first before disclosing information.
Studying online privacy therefore needs to adopt the situational paradigm instead of the adversarial paradigm. The findings provide implications for policy-making in that online privacy protection should not be fixed, but rather should take human relativism into consideration.

Future Research Directions

The uses and gratifications perspective is one of the precious few theories that the communication discipline can really call its own (Lin, 1996). Theoretically, gratifications can be categorized into two dimensions: gratifications that result from the pleasurable experience of media content and are realized during consumption (i.e., process gratifications), and gratifications that result from learning information from media content and subsequently putting it to use in practical affairs (i.e., content gratifications). How do both types of gratification influence privacy concerns and practices?
The process-content distinction may simply not be an applicable one in Internet settings, since distinctions between the real world and the mediated world are vanishing, which is also the main argument of the SPP paradigm. Online photo album users' consumption is gradually more woven into the fabric of everyday life. Societies have become more performative, with an increasing spectacularisation of the social world and narcissistic individuals. Given that, is there no difference between these two types of gratification on privacy concerns and practices? It is certainly worth examining.
Communication scholars not only can employ their very own theory in studying privacy, but can also provide a more focused view of the communication interaction process than scholars from other fields, which will facilitate more profound online privacy theories, frameworks, and models. Thus, building an interdisciplinary privacy research community should be the future research trajectory.

References
Abercrombie, N., & Longhurst, B. (1998). Audiences: A sociological theory of performance and
imagination. CA: Sage Publication.
Barreto, M., & Ellemers, N. (2002). The impact of
anonymity and group identification on progroup
behavior in computer-mediated groups. Small
Group Research, 33, 590-610.
Bennett, C. J. (1992). Regulating privacy. Ithaca,
NY: Cornell University Press.



Charney R., & Greenberg, B. S. (2002). Uses


and gratifications of the Internet. In C. Lin & D.
Atkin (Eds.), Communication, technology and
society: New media adoption and uses. Cresskill,
NJ: Hampton Press.
Cho, H., & LaRose, R. (1999). Privacy issues
in Internet surveys. Social Science Computer
Review, 17(4), 421-434.
Chou, C., & Hsiao M. C. (2000). Internet addiction, usage, gratification, and pleasure experience:
the Taiwan college students case. Computers &
Education,35, 65-80.
Culnan, M. (1999a). Georgetown Internet privacy policy survey: Report to the federal trade
commission. Retrieved December 10, 2000, from
http://www.msb.edu/faculty/culnanm/gippshome.
html
Culnan, M. (1999b). Privacy and the top 100 web
sites: Report to the federal trade commission.
Retrieved Dec 10, 2000, from http://www.msb.
edu/faculty/culnanm/gippshome.html
Culnan, M., & Armstrong, P. (1999), Information
privacy concerns, procedural fairness, and impersonal trust: An empirical evidence. Organization
Science, 10(1), 104-115.
Dominick, J. (1999). Who do you think you are?
Personal home pages and self-presentation on
the world wide web. Journalism and Mass Communication Quarterly, 76, 646-658.
Döring, N. (2002). Personal home pages on the web: A review of research. Journal of Computer-Mediated Communication, 7(3). Retrieved December 26, 2003, from http://jcmc.indiana.edu/vol7/issue3/doering.html
Federal Trade Commission. (1998, June). Privacy
online: A report to congress. Retrieved January 2,
2001, from http://www.ftc.gov/reports/privacy3
Federal Trade Commission. (1999, July). Self-regulation and privacy online: A federal trade commission report to congress. Retrieved January 2, 2001, from http://www.ftc.gov/os/1999/9907/index.htm#13
Federal Trade Commission. (2000, May). Privacy online: Fair information practices in the
electronic marketplace: A federal trade commission report to congress. Retrieved January 2,
2001, from http://www.ftc.gov/os/2000/05/index.
htm#22
Ferguson, D. A., & Perse, E.M. (2000). The world
wide web as a functional alternative to television.
Journal of Broadcasting & Electronic Media,
44, 155-174.
Hall, S. (1980). Encoding/decoding. In S. Hall,
D. Hobson, A. Lowe, & P. Willis (Eds.), Culture,
media, language: Working papers in cultural
studies. London: Hutchinson.
Hine, C., & Eve, J. (1998). Privacy in the marketplace. Information Society, 14(4), 253-262.
Hoffman, D. L., & Novak, T. P. (1996). Marketing
in hypermedia computer-mediated environments:
conceptual foundations. Journal of Marketing,
60(3), 50-68.
Hoffman, D. L., Novak, T. P. & Peralta, M. A.
(1999, April-June). Information privacy in the
marketspace: Implications for the commercial uses
of anonymity on the web. Information Society,
15(2), 129-139.

Kate, N. (1998, January). Women want privacy.
American Demographics, 20(1), 37.
Katz, E., Blumler, J., & Gurevitch, M. (1974).
Utilization of mass communication by the individual. In J. G. Blumler & E. Katz (Eds.), The uses
of mass communication: Current perspectives
on gratifications research (pp. 19-34). Beverly
Hills, CA: Sage.
Katz, E., Gurevitch, M., & Hass, S. (1973). On the
use of mass media for important things. American
Sociological Review, 38,164-181.
Kaye, B. K. (1998). Uses and gratifications of
the world wide web: From couch potato to web
potato. The New Jersey Journal of Communication, 6, 21-40.
Keith, W. B. (2005). Internet addiction: A review of current assessment techniques and potential assessment questions. CyberPsychology & Behavior, 8(1), 7-14.
Kershaw, B. (1994). Framing the audience for theater. In R. Keat, N. Whiteley, & N. Abercrombie
(Eds.), The authority of the consumer. London:
Routledge.
Korgaonkar, P. K., & Wolin, L. D. (1999). A
multivariate analysis of web usage. Journal of
Advertising Research, 39(2), 53-68.

Hsu, C. W. (2002). Online privacy issues: Comparison between net users concerns and web
sites privacy statements. Paper presented to the
52nd Annual Conference of International Communication Association, Seoul, Korea.

LaRose, R., Mastro, D. A., & Eastin, M. S. (2001). Understanding internet usage: A social cognitive approach to uses and gratifications. Social Science Computer Review, 19, 395-413.

Hsu, C. W. (2006). Privacy concerns/privacy


practices: Toward a situational paradigm. Online
Information Review, 30(5), 569-586.

Lea, M., Spears, R., & de Groot, D. (2001). Knowing me, knowing you: Anonymity effects on social
identity processes within groups. Personality and
Social Psychology Bulletin, 27, 526-537.

Jhu, C. H., & Yung, H. C. (2006, February 17). The hit beauty in online photo album becomes a TV anchor and a celebrity in showbiz. Ettoday. Retrieved March 26, 2006, from http://www.ettoday.com/2006/02/17/10845-1906739.htm

Lelia, G. (2001). Treating internet users as audiences: Suggesting some research directions.




Australian Journal of Communication, 28(1),


33-42.
Lin, C. A. (1996). Looking back: The contribution
of Blumler and Katzs uses of mass communication
to communication research. Journal of Broadcasting & Electronic Media, 40(4), 574-581.
Livingstone, S. (2004). The challenge of changing
audiences: Or, what is the audience researcher to
do in the age of the Internet? European Journal
of Communication, 19(1), 75-86.
Longhurst, B., Bagnall, G., & Savage, M. (2004).
Audiences, museums and the English middle class.
Museum and Society, 2(2), 104-124.
McKenna, K. Y. A., & Bargh, J. A. (2000). Plan 9
from cyberspace: The implications of the internet
for personality and social psychology. Personality
and Social Psychology Review, 4, 57-75.
Milne, G. R., Culnan, M. J., & Greene, H. (2006).
A longitudinal assessment of online privacy notice
readability. Journal of Public Policy & Marketing, 25(2), 238-249.
Milne, G. R., & Rohm, A. J. (2000). Consumer
privacy and name removal across direct marketing
channels: Exploring opt-in and opt-out alternative. Journal of Public Policy & Marketing, 19(2),
238-249.
Moor, J. H. (1997). Towards a theory of privacy
in the information age. Computers and Society,
27(3), 27-32.
Morris, M., & Ogan, C. (1996). The internet as
a mass medium. Journal of Communication,
46(1), 39-50.
ONeil, D. (2001). Analysis of internet users
level of online privacy concerns. Social Science
Computer Review, 19(1), 17-31.
OSullivan, P. B. (2000). What you dont know
wont hurt me: Impression management functions
of communication channels in relationships. Human Communication Research, 26(3), 403-431.



Palmgreen, P., Wenner, L. A., & Rosengren, K.


E. (1985). Uses and gratifications research: The
past ten years. In K. E. Rosengren, L. A. Wenner,
& P. Palmgreen (Eds.), Media gratifications research: Current perspective (pp. 11-37). Beverly
Hills, CA: Sage.
Papacharissi, Z. (2002). The self online: The utility
of personal home pages. Journal of Broadcasting
&Electronic Media, 46(3), 346-368.
Papacharissi, Z. (2004, May). The blogger
revolution? Audiences as media producers.
Paper presented at the annual convention of the
International Communication Association, New
Orleans, LA.
Parker, B. J., & Plank, R. E. (2000). A use and
gratifications perspective on the internet as a new
information source. American Business Review,
18(2), 43-49.
Plichtova, J., & Brozmanova, E. (1997). Social representations of the individual and the community
well-being: Comparison of the empirical data from
1993 and 1995. Sociologia, 29(4), 375-404.
Postmes, T., Spears, R., Sakhel, K., & de Groot,
D. (2001). Social influence in computer-mediated
communication: The effects of anonymity on
group behavior. Personality and Social Psychology Bulletin, 27, 1243-1254.
Raab, C. D., & Bennett, C. J. (1998). The distribution of privacy risks: Who needs protection? The
Information Society, 14, 263-274.
Reinard, J. C. (2006). Communication research
statistics. CA: Sage.
Sheehan, K. B. (2002). Toward a typology of
Internet users and online privacy concerns. The
Information Society, 18, 21-32.
Sixsmith, J., & Murray, C. D. (2001). Ethical issues in the documentary data analysis of internet
posts and archives. Qualitative Health Research,
11(3), 423-432.


Smith, H. J., Milberg, S. J., & Burke, S. J. (1996). Information privacy: Measuring individuals' concerns about organizational practices. MIS Quarterly, 20(2), 167-196.
Song, I., LaRose, R., Eastin, M. S., & Lin, C. A.
(2004). Internet gratifications and internet addiction: On the uses and abuses of new media.
CyberPsychology &Behavior, 7(4), 384-394.
Spector, L. (2003, December 26). Guide to online
photo album sites: Heres how to post and share
digital memories of your holidays. PC World. Retrieved April 14, 2004, from http://www.pcworld.
com/news/article/0,aid,114040,00.asp

Swanson, D. L. (1992). Understanding audiences: Continuing contributions of gratifications research. Poetics, 21, 305-328.

Taylor, C. R., Franke, G. R., & Maynard, M. L. (2000). Attitudes toward direct marketing and its regulation: A comparison of the United States and Japan. Journal of Public Policy & Marketing, 19(2), 228-237.

Timothy, R. (1999). The construction of the world wide web audience. Media, Culture & Society, 21(5), 673-684.

Trammell, K. D., Williams, A. P., Postelnicu, M., & Landreville, K. D. (2006). Evolution of online campaigning: Increasing interactivity in candidate web sites and blogs through text and technical features. Mass Communication & Society, 9(1), 21-44.

Turow, J., & Hennessy, M. (2007). Internet privacy and institutional trust: Insights from a national survey. New Media & Society, 9(2), 300-318.

Webster, J. G., & Lin, S. (2002). The internet audience: Web use as mass behavior. Journal of Broadcasting & Electronic Media, 46, 1-12.

Wei, R., & Lo, V. H. (2006). Staying connected while on the move: Cell phone use and social connectedness. New Media & Society, 8(1), 53-72.

Young, K. S. (1998). Internet addiction: The emergence of a new clinical disorder. CyberPsychology & Behavior, 1(3), 237-244.

Additional Reading

Bennett, C., & Raab, C. (2006). The governance of privacy: Policy instruments in global perspective (2nd ed.). Cambridge, MA: The MIT Press.

Bird, S. E. (2003). The audience in everyday life: Living in a media world. N.Y. and London: Routledge.

Blumler, J. G., & Katz, E. (1974). The uses of mass communications: Current perspectives on gratifications research. Beverly Hills, CA: Sage.

Derks, D., Bos, A. E. R., & Grumbkow, J. (2007, January). Emoticons and social interaction on the internet: The importance of social context. Computers in Human Behavior, 23(1), 842-849.

Earp, J. B., & Baumer, D. (2003). Innovative


web use to learn about consumer behavior and
online privacy. Communication of the ACM,
46(4), 81-83.
Hine, C., & Eve, J. (1998). Privacy in the marketplace. Information Society, 14(4), 253-262.
Howard, P. N., & Jones, S. (Eds.). (2004). Society
Online: The Internet in Context. Thousand Oaks,
CA: Sage.
Hsu, C. W. (2003). A cross-country examination
of online privacy issues: From an adversarial
paradigm toward a situational paradigm--A
comparison of regulations, net users concerns
and practices, and web sites privacy statements
in China, the Netherlands, Taiwan, and the United
States. Unpublished doctoral dissertation, State
University of New York at Buffalo.
Hsu, C. W. (2007). Staging on the internet: Research on online photo album users in Taiwan



with the spectacle/performance paradigm (SPP).


CyberPsychology & Behavior, 10(4), 596-600.
Katz, J. E. & Rice, R. E. (2002). Social consequences of internet use: Access, involvement and
interaction. Cambridge MA: The MIT Press.
Lally, L. (1996). Privacy versus accessibility: The
impact of situationally conditioned belief. Journal
of Business Ethics, 15, 1221-1226.
Lin, C. A. (1999). Online-service adoption likelihood. Journal of advertising research, 39 (2),
79-89.
Lin, C. A., & Atkin, D. J. (Jan, 2002). Communication technology and society: Audience adoption
and uses. The Hampton Press.
Milberg, S. J., Smith, H. J., & Burke, S. J. (2000).
Information privacy: Corporate management
and national regulation. Organization Science,
11(1), 35-57.


Morris, M., Ogan, C. (1996). The internet as


mass medium. Journal of Computer-Mediated
Communication, 1(4).
Ribak, R., & Turow, J. (2003). Internet power and
social context: A globalization approach to web
privacy concerns. Journal of Broadcasting and
Electronic Media, 47(3), 328-349.
Streeter, T. (1995, May). No respect! Disciplinarity
and media studies in communication-- Introduction: For the study of communication and against
the discipline of communication. Communication
Theory, 5(2), 130-143.
Tavani, H. T. (1999). Privacy online. Computers
and Society, 29(4), 11-19.
Yun, G. W., & Trumbo, C. W. (2000). Comparative
response to a survey executed by post, e-mail, &
web form. Journal of Computer Mediated Communication, 6(1).

Section IV

Consumer Privacy in Business


Chapter XI

Online Consumer Privacy


and Digital Rights Management
Systems
Tom S. Chan
Southern New Hampshire University, USA
J. Stephanie Collins
Southern New Hampshire University, USA
Shahriar Movafaghi
Southern New Hampshire University, USA

Abstract

While delivering content via the Internet can be efficient and economical, content owners risk losing control of their intellectual property. Any business that wishes to control access to and use of its intellectual property is a potential user of digital rights management (DRM) technologies. DRM controls content delivery and distribution, but it may affect users' privacy rights. Exactly how privacy should be preserved is a matter of controversy, and the main industry solution is the W3C Platform for Privacy Preferences (P3P) initiative. But the issue remains unresolved, and industries and consumers are confronted with several incompatible standards.

Introduction

The Internet has made an unprecedented impact on daily life. Broadband subscriptions in the U.S.

are projected to jump from 24 million in 2003 to


nearly 50 million in 2008 (Cravens, 2004). The
proliferation of digital technologies in homes, workplaces, and public places will enable a rapidly growing number of people to have access to the information superhighway. The e-commerce
the information superhighway. The e-commerce
paradigm is both an opportunity and challenge
to business. It also has implications for content
owners and content users/consumers. While delivering content via the Internet can be efficient
and economical, content owners risk losing control
of their intellectual property. Use of software to
control this delivery and distribution of digital
content may also affect users privacy rights. A
digital method of controlling access to and use
of digital content must be developed, and used
in combination with policies put in place in order
to protect the rights of content owners without
infringing on the rights of users. Any business
that wishes to control access to and use of its
intellectual property is a potential user of digital
rights management (DRM) technologies. DRM
technologies are any of various computer-based
methods of protecting digitally stored content,
usually applied to creative media. These include
various ways of enforcing access control to authorized users only. This may be done through
different strategies or combinations of strategies
that may include:
a. Identifying authorized users;
b. Identifying genuine content;
c. Verifying proof of ownership and purchase;
d. Uniquely identifying each copy of the content;
e. Preventing content copying;
f. Tracking content usage and distribution; and
g. Hiding content from unauthorized users.
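Strategies (c), (d), and (f) above all rest on binding an identifier to a particular copy of the content and verifying it later. The sketch below is a deliberately simplified, hypothetical illustration of that idea using a keyed hash (HMAC) from the Python standard library; it does not describe any specific commercial DRM system, and the key handling is reduced to a placeholder constant.

import hmac
import hashlib

SECRET_KEY = b"distributor-secret-key"  # placeholder; a real system would protect this key

def fingerprint_copy(content: bytes, purchaser_id: str) -> str:
    """Derive a per-copy tag binding the content bytes to a purchaser."""
    msg = hashlib.sha256(content).digest() + purchaser_id.encode("utf-8")
    return hmac.new(SECRET_KEY, msg, hashlib.sha256).hexdigest()

def verify_copy(content: bytes, purchaser_id: str, tag: str) -> bool:
    """Check that a presented copy and purchaser match the issued tag."""
    expected = fingerprint_copy(content, purchaser_id)
    return hmac.compare_digest(expected, tag)

# Issue a tag when the copy is sold, and verify it when the copy is used.
song = b"...digital content bytes..."
tag = fingerprint_copy(song, "customer-0042")
print(verify_copy(song, "customer-0042", tag))   # True
print(verify_copy(song, "someone-else", tag))    # False

Notice that a per-copy identifier of this kind is exactly what allows a distributor to link usage back to an individual purchaser, which is the privacy tension discussed in the remainder of this chapter.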

The entertainment industry is leading the


charge for DRM adoption as a means to preserve
their copyrights. Because the online distribution
of digital content is different from physical
distribution of content incorporated on media
(downloading music vs. buying a CD), many
digital content providers are actively reevaluating

their business models. They are opting to move


from a traditional buy-and-own to a pay-per-use
model with the incorporation of DRM technologies (ClickZ, 2003). However, this trend poses
great risks to consumers and society. DRM technologies weaken the rights to privacy and fair use, and threaten the freedom of expression (EPIC,
2002). Naturally, everyone is in favor of privacy.
Exactly how it should be preserved is a matter
of controversy; and the main industry initiative
facilitating online privacy is the W3C platform
for privacy preferences initiative.
At the moment, consumers are faced with
several incompatible standards, and thus may face
different kinds of threats to privacy as a result.
When no standard exists, each content distributor's DRM system must be treated individually. Content
distributors continue the struggle with technology
companies over which form of DRM should be
included in the next generation of products and
services. Some providers have even made content available without DRM protection at all (EMI,
2007). At stake is the future of consumer interactions with, and uses of, copyrighted digital media.
The conflict is between the rights of content owners and the privacy rights of content users. This
chapter will discuss several DRM techniques and
how their use could affect consumer privacy.

Producers' Rights: Legal Framework for Copyrights
A copyright is the legal right granted to an author,
composer, playwright, publisher, or distributor to
exclusive publication, production, sale, or distribution of a literary, musical, dramatic, or artistic
work. It provides content creators with the legal
right to be paid for, and to control the use of their
creations. The foundation for copyright is based on the U.S. Constitution, which gives Congress the power "to promote the progress of science and useful arts, by securing for limited times to authors and inventors the exclusive right to their respective writings and discoveries" (LII, 2007a).




The copyright law grants copyright holders the


exclusive rights of reproduction, adaptation,
publication, performance, and display subject to
limitations, qualifications, or exemptions (LII,
2007b).
Copyright law is created to uphold a balance,
and it is never absolute. The U.S. Supreme Court
has repeatedly asserted that copyright monopoly
is established primarily to benefit the public interest. For example, exceptions have repeatedly been
invoked to prevent copyright owners from misusing their copyrights in order to stifle legitimate
marketplace competition. Copyright law allows
for certain uses of content without violating the
law as long as the copy is for personal non-profit
use, and not commercial purposes. The generally
accepted fair use exceptions are the following:
a) the private copying exception allows users to
make a back-up copy; b) copy for friends and
family; c) the educational use exception allows
teachers and researchers to copy for research or
study, and as illustration to students; d) the citation
exception allows quotes from content to express
criticism or review; and e) the parody exception
allows quotes from content for the purpose of
caricature or parody.
As with any societal creation, copyright is
affected by the development of new technology.
History has repeatedly proven this point, from the
advent of the printing press, television broadcasting, and now to our brave new digital age. The
last decade has seen tremendous growth and
advancement in new information technologies.
Some argue that the spread of these technologies
marks the end of copyright. The Digital Millennium Copyright Act (DMCA) was enacted to
counter the rapid growth of technology. The act
criminalized the development or use of hardware
or software that cracks digital copy-protection
schemes (USG, 1998). Such actions facilitate
others to access materials that are copyright
protected, and thus break the copyright law. The
creation and distribution of DRM circumvention
tools is illegal. But the DMCA favors owners' rights over users' rights, because it infringes on fair use.


For example, even when a user copies legitimately
under the fair use exceptions and circumvents the
copyright protection, he or she may be violating
DMCA, a federal statute.

Users' Rights: Legal Framework for Privacy
The term privacy is used frequently in ordinary
language as well as in philosophical, political, and
legal discussions, yet there is no single definition
of the term. The concept of privacy is a central
tenet of post-enlightenment thought. It affirms
the inviolability of each individual's rights over
her/his own person. The U.S. Supreme Court has
been explicit in ruling that privacy is a central
reason for the Fourth Amendment (LII, 2007c).
Surveillance, eavesdropping, and appropriation
of one's communication are forms of illegal
search and seizure. They threaten rights of personal integrity and self-definition in subtle but
powerful ways. In 1986, privacy protection was
extended to cover electronic communications
of the modern era with the Electronic Communications Privacy Act (LII, 2007d). In general,
it is illegal to acquire and process personal data
unless the persons involved have agreed with the
acquisition and processing of their personal data
and the personal data is required for the execution of whatever process it is used in. It is also
illegal to keep personal data longer than necessary. But, personal privacy is now threatened
even more in the digital information age. There
are massive databases with financial and credit
histories, medical records, daily purchases, and
contacts, that are all accessible via the Internet.
The capability exists for others to mine and link
the databases, but there are few controls over how
the information can be used and distributed. The
predicament makes individual control over one's
privacy next to impossible.
In the context of digital content management,
apart from the protection of personal information,


privacy can also be characterized by access and


the right to be left alone within one's personal physical and virtual space. For example, a person's home is a place of retreat from the eyes of the outside world, where one may shed the situational personae adopted in public, and freely express oneself unobserved and unobstructed. Respect for
personal private space is as crucial as liberty and
autonomy. That and selfhood are the hallmarks
of a free society. Thus the concept of privacy can
also be understood as the freedom to dictate the
circumstances, that is, the time, place, method,
duration, and frequency of ones own intellectual
consumption, such as listening to a digital music
download, unobserved and unimpeded by others
(Cohen, 2003). Such freedom is absent in public
places, but rights to privacy afford individuals
the unfettered ability for intellectual consumption
without restrictions in ones private spaces.
Privacy would thus apply to the right to read
digital content without being observed, and
without having the instances of reading tracked
and recorded for later retrieval by anyone. When
a physical object with content is sold by the originator, he or she may never know how or when it
will eventually be used by the buyer, or even if
it will ever be used. The book may not be read,
and the music may never be heard. With digital
content, under some DRM systems, the originator
or seller could have the capability to know when
the content is used, how it is used, and when the
content is passed on to someone else. No longer
is consumption of intellectual content a private
event.

DIGITAL RIGHTS MANAGEMENT TECHNOLOGIES
The earliest forms of DRM in the Internet age can be traced back to the click-wrap license. The Uniform Computer Information Transactions Act (UCITA, 2002) was passed in 2001 to facilitate e-commerce. Basically, it is traditional commercial law with a digital twist. Software distributors can use click-wrap licenses as a convenient way to license software to end users on the Internet. The statements can be in unformatted text or in metadata, with the rights holder's name and claims over the work provided to users. When a user agrees to the end user license agreement (EULA), that user is legally bound by its terms. There is no easy way to enforce the EULA except by counting on the user's good faith. The success of this form of DRM relies upon a high-volume, low-cost principle. By making content affordable, users can easily acquire the product legally, making illicit activities less appealing. Furthermore, egregious infringements can be brought under control using the legal system. Measures such as registration make illegal copying inconvenient and burdensome.
With the mainstreaming of personal computers and the Internet into society, content owners look to copy-protection technologies to enclose and protect their copyrighted digital content. Digital rights management refers to protecting the copyright of digital content by monitoring content usage and restricting what actions an authorized recipient may take with regard to that content. While some view these technologies as playing Big Brother and controlling behavior, DRM is a necessary mechanism in a free market system. By preserving property rights, it encourages people to innovate because they can be more certain of eventual rewards. DRM gives owners the ability to securely distribute their valued content and to control its use, preventing unauthorized distribution (Russ, 2001).
Whether the content is periodicals, books,
photographs, music, or video, DRM can be used
to identify content and manage access, typically
using steganographic and encryption techniques.
While encryption allows only authorized users
to access the message, cryptographic algorithms
generate messages that are easily recognizable as
encrypted. Although the content remains illegible and protected, encrypted messages attract attention; thus their use is undesirable in some circumstances. Steganography, on the other hand,
embeds a secret message into another message
that serves as a carrier. The goal is to modify the
carrier in an imperceptible way that reveals neither
the embedding of a message, nor the embedded
message itself. Together, steganography and
cryptography are two essential ingredients in
making copyrighted digital contents secure.

Digital Watermarking

Steganography is the art and science of hiding a message in a medium, such as a digital image, audio, or video file, in a way that defies detection. Application of steganography can be traced back
to antiquity. In the story of The 300 Spartans,
Demeratus needed to warn the Spartans that Xerxes, King of Persia, was about to invade Greece.
To send the message without detection, Demeratus
removed the wax from a writing tablet, wrote his
message on the wood underneath, and then covered
the message with wax, making the tablet look
like a blank one. While these kinds of methods
worked in times past, they were replaced in the
modern era by more sophisticated techniques such
as invisible inks and microdots. In the computer
age, digital images, as well as audio and video
files, offer a rich medium for hiding an almost
unlimited amount of data.
The rise of peer-to-peer (P2P) networks has been an inevitable outgrowth of the rise of the Internet. Unfortunately, P2P networks have grown from helpful tools for information sharing into havens for unauthorized copies of copyrighted materials. Digital content identification, or steganography, is crucial in setting up controlled distribution systems and provides an efficient means of copyright protection. Steganographic techniques can be divided into two categories: digital watermarking and digital fingerprinting. While watermarking is an arrangement of digital bits hidden in the content of the carrier, fingerprinting is the derivation of a unique identifier based upon the characteristics of the content itself. Watermarking and fingerprinting are complementary technologies.
While a digital fingerprint is generated from the
digital good itself and perceptible to humans,
a watermark is embedded into the content and
designed to be imperceptible. Digital fingerprints
can only identify content ownership, and are specific to the content itself. Watermarks can hold
other embedded information, for example, usage
entitlement or usage expiration date. Though both
techniques are in their infancy, they are robust
enough to survive duplication and conversion of
digital files which will certainly be attempted by
the copyright infringers.
When used on paper, a watermark is an image that appears on valuable documents to prove
their authenticity and prevent counterfeiting.
The watermark is designed to appear only when
the paper is held at a particular angle, or against
a black background. Standard paper usually
does not include a watermark, as watermarking
the paper incurs an added cost. Currency is an
example of the use of a watermark. By holding
a piece of currency to the light and turning it to
change the viewing angle, a faint image appears.
Prior to the digital age, it was quite complicated
to produce a counterfeit that looked like the
original. This is not the case in the digital world,
where it is extremely easy to duplicate original
digital content without the slightest loss of quality. Just as an artist signs a painting with a brush to claim copyright, the same concept can be applied to digital products by embedding a digital watermark in the content to identify its ownership.
In the digital world, a watermark is inserted into the content itself. Digital watermarks are so named because they serve the same purpose as watermarks on paper. Since all software files have predefined formats that are deterministic, they can be modeled mathematically and used as a basis for comparison. Steganography programs alter subtle characteristics of a file, such as color, frequency, tone, noise, and distortion, to generate small variances in the digital fingerprint of the content. Naturally, they can also take advantage of unused bits within the file structure to hide information. Take the least significant bit (LSB) watermarking technique as an illustration. In this
technique, the watermark is stored in the lower
order bits of selected pixels in an image file. The
original 8 bit gray scale image data is compressed
to 7 bits by adaptive histogram manipulation. If
this process is followed by a compensating mapping to restore the dynamic range, the resulting
image is practically indistinguishable from the
original. The process enables the LSB of the
content file to carry the watermark information,
and the watermark can be decoded by comparing
the LSB bit pattern with a stored counterpart.
Because regular software that creates the files
would never produce these variances, the watermark can be detected and recovered.
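As a concrete illustration of the LSB idea just described, the following Python sketch embeds a short byte string in the least significant bits of an image's pixel values and recovers it again. It is a minimal toy example under stated assumptions: the image is treated as a flat list of 8-bit gray-scale values, the helper names (embed_lsb_watermark, extract_lsb_watermark) are invented for this illustration, and the histogram compression and robustness measures a real watermarking system needs are omitted.

# Minimal LSB watermarking sketch (illustrative only, not robust to
# compression or re-encoding). Pixels are 8-bit gray-scale values.

def bits_from_bytes(payload: bytes):
    """Yield the payload as individual bits, most significant bit first."""
    for byte in payload:
        for shift in range(7, -1, -1):
            yield (byte >> shift) & 1

def embed_lsb_watermark(pixels, payload: bytes):
    """Store each payload bit in the least significant bit of one pixel."""
    bits = list(bits_from_bytes(payload))
    if len(bits) > len(pixels):
        raise ValueError("image too small for this payload")
    marked = pixels[:]
    for i, bit in enumerate(bits):
        marked[i] = (marked[i] & 0xFE) | bit   # clear the LSB, then set it to the bit
    return marked

def extract_lsb_watermark(pixels, length: int) -> bytes:
    """Recover `length` bytes by reading pixel LSBs back in the same order."""
    out = bytearray()
    for i in range(length):
        byte = 0
        for bit_index in range(8):
            byte = (byte << 1) | (pixels[i * 8 + bit_index] & 1)
        out.append(byte)
    return bytes(out)

if __name__ == "__main__":
    image = [120, 121, 119, 200, 201, 202, 50, 51] * 10   # toy "image"
    marked = embed_lsb_watermark(image, b"ID42")
    print(extract_lsb_watermark(marked, 4))               # b'ID42'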
Watermarks can be embedded in a number of
domains. Embedding is the process of manipulating coefficients in a particular domain in such a way that the manipulation can be detected later in that or another domain. For example,
the embedding domain can be spatial where the
coefficients are the pixel values; or frequency,
where a watermark is encoded in the frequency
coefficients. Still image watermarking techniques
can be extended to video signals as video can be
considered as a sequence of still images. For audio
signals, the main constraint is that the watermark
must be inaudible to listeners. One technique is
to add a faint sound as background noise. As
digital audio and video are generally stored and
distributed in compressed format, compression
is the greatest challenge to digital watermarking.
Compression algorithms invariably throw away
some of the data, including parts that belong to
the watermark. Like the numerous compression techniques themselves, there is neither a perfect nor an absolute watermarking technique. A high-compression-rate algorithm reduces the size of a digital file, but it generally cannot recover all of the original information when the file is decoded; compression at very high rates cannot remain lossless. Watermarking techniques face a similar trade-off: the stronger the embedded watermarking signal, the easier it will be to detect, but the more likely it is to affect the quality of the reproduced content (Cox, Miller, & Bloom, 2002).

Digital Fingerprinting
A digital fingerprint is a unique pattern that describes the content for identification purposes. Digital fingerprinting cannot prevent illegal copying, but it enables copyright owners or content distributors to track the recipients who leak or redistribute the fingerprinted content. Unlike a watermark, a fingerprint is not added into, but is extracted from, the existing characteristics of the digital content. While the watermark for a recorded song may be a faint background sound, the fingerprint would be derived from the song's tempo, rhythms, the length of verses or movements, the mix of instruments used, or other features. Fingerprint identification works by matching a small sample of digital content against a database of fingerprints of the original content. Since digital fingerprinting does not add data to the content, it can be applied to content that is already published. On the other hand, as the fingerprint is derived from the content, it cannot be used to store information about the content the way a watermark does. Furthermore, taking the fingerprint of existing content does require added work and equipment. Once the fingerprint is created and stored in a database, it can then perform functions similar to those of a digital watermark, acting as a unique identifier for each piece of content. In any case, to be persistent, a watermark or fingerprint must be able to survive any of the digital duplications and transformations that a copyright infringer will likely attempt on the content.
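A minimal sketch of this fingerprint-and-match workflow, in Python, is shown below. It is illustrative only: a real system derives a perceptual fingerprint that survives re-encoding and can match short samples, whereas this toy version simply hashes coarse byte chunks and therefore only recognizes exact copies; the function and database names are invented for the example.

# Toy content-fingerprinting sketch: derive a compact identifier from the
# content itself and look it up in a database of known works.
import hashlib

def fingerprint(content: bytes, chunk_size: int = 4096) -> str:
    """Hash coarse chunks of the content and combine them into one digest."""
    digest = hashlib.sha256()
    for start in range(0, len(content), chunk_size):
        chunk = content[start:start + chunk_size]
        digest.update(hashlib.sha256(chunk).digest())
    return digest.hexdigest()

# Database of fingerprints for published works (normally persistent storage).
fingerprint_db = {}

def register_work(title: str, content: bytes) -> None:
    fingerprint_db[fingerprint(content)] = title

def identify(sample: bytes):
    """Return the registered title whose fingerprint matches the sample, if any."""
    return fingerprint_db.get(fingerprint(sample))

if __name__ == "__main__":
    song = b"...digital audio data..." * 100
    register_work("Example Song", song)
    print(identify(song))          # "Example Song"
    print(identify(b"unrelated"))  # None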
Rather than preventing illegal copying, steganography is used to detect copying after the fact. An example of this application is traitor tracing (Chor, Fiat, Naor, & Pinkas, 2000). Copyrighted materials are secured, and a person who wishes to access the material is required to submit identification to the DRM system. The system generates a digital fingerprint according to the user identity. The fingerprint is then embedded back into the material as a digital watermark. If leakage of the content occurs, the offender can be traced from the fingerprint on the leaked material. Apart from traitor tracing, steganography can be indispensable in the pursuit and location of illegal copyrighted material online. For example, a piece of watermarked digital music allows the owner to create a search engine that can find the marked clip on the Internet. The owner can then determine its legality and seek remedy. Under U.S. copyright law, to have allegedly infringing material removed from a service provider's network, or to have access to the material disabled, the owner must provide notice to the provider. Once the notice is given, or in circumstances where the provider discovers the infringing material itself, the service provider is required to expeditiously remove, or disable access to, the material.
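The traitor-tracing idea described above can be sketched as follows, with all names and the embedding step assumed for illustration; a real system would hide the per-user mark steganographically, for instance with an LSB watermark such as the one shown earlier, rather than simply appending it.

# Traitor-tracing sketch: each authorized recipient gets a copy carrying a
# mark derived from his or her identity; a leaked copy is traced by
# recovering the mark and looking it up. Illustrative only.
import hashlib
import hmac

SERVER_SECRET = b"distributor-only-key"   # kept by the content owner
issued_marks = {}                         # mark (hex) -> user identity

def mark_for_user(user_id: str) -> bytes:
    """Derive a short, unforgeable per-user mark (8 bytes here)."""
    return hmac.new(SERVER_SECRET, user_id.encode(), hashlib.sha256).digest()[:8]

def issue_copy(master_copy: bytes, user_id: str) -> bytes:
    """Embed the user's mark; appended here as a stand-in for a real
    steganographic embedding."""
    mark = mark_for_user(user_id)
    issued_marks[mark.hex()] = user_id
    return master_copy + mark

def trace_leak(leaked_copy: bytes):
    """Recover the mark from a leaked copy and identify the recipient."""
    mark = leaked_copy[-8:]
    return issued_marks.get(mark.hex())

if __name__ == "__main__":
    master = b"<protected content>"
    copy_for_alice = issue_copy(master, "alice@example.com")
    print(trace_leak(copy_for_alice))   # alice@example.com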
This process is called notice and takedown. Though it provides some protection to copyrighted materials, notice and takedown is a reactive approach: it takes place after the infringing material has been uploaded to the network and the damage is presumably already done. A more proactive technique is fingerprint filtering. The technology examines a content file and determines its identity by looking up digital fingerprints in a database. A service provider can either forbid the use of a file with an identified fingerprint or allow it only under specified conditions. Several P2P sites are adopting the technology to verify content uploaded by users in order to reduce the problem of copyright infringement. If the content is in the fingerprint database, the upload is automatically blocked. Apart from the accurate monitoring of file transfers, the technology can also be vital in systems that account for and determine appropriate royalties to artists, music publishers, and record companies (Kellan, 2001).

Encryption and the Buy-and-Own Model
Encryption is the process involved in the science
of cryptography, which takes a message in plain
text and performs some kind of transformation
on it so it cannot be understood except by someone possessing the decryption key, which could
be a number, a character, or a set of characters.
Modern DRM technologies use cryptography
to ensure that the permissions made available
by the content owner are not breached. A media
file is encrypted using a key specific for a user.
Individual keys for viewing or listening to the
content are provided to a user who has purchased
the content, but the usage right can include limitations on copying, printing, and redistribution.
When the user opens a content file, the DRM software checks the user's identity and contacts the owner's Web site. If permission is granted, the file is decrypted and assigned a key for future access. This type of system can have a wide range of sophistication. In the simplest form, contents are encrypted using a single master key. But such an approach is fragile, as compromising one key can compromise the entire system. The system can be strengthened using a multiple keys-multiple encryptions approach similar to a public key infrastructure (PKI) system. A PKI is an arrangement that binds public keys with respective user identities by means of a certificate authority (CA). The user identity must be unique for each CA. The binding is carried out by software at a CA, possibly under human supervision, together with other coordinated software at distributed locations. For each user, the user identity, the public key, their binding, validity conditions, and other attributes are made unforgeable in public key certificates issued by the CA. Under this approach, the loss of a single key means only the compromise of a single piece of content, not the entire system (Wikipedia, 2007). The method locks content away from everyone except authorized users, and the same methodology can also be used to track users' access times and the specific sections of content they access.
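A highly simplified sketch of this per-user key arrangement is given below. It assumes the third-party Python cryptography package and uses invented function and variable names; a real DRM client would authenticate the user, fetch the license over the network, and protect key material in hardware rather than in ordinary program variables.

# Simplified per-user content encryption with a license check before decryption.
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

# Server side: one key per (user, content) pair instead of a single master key,
# so that compromising one key exposes only one piece of content.
content_keys = {}
licenses = {}

def package_content(user_id: str, content_id: str, plaintext: bytes, rights: dict) -> bytes:
    key = Fernet.generate_key()
    content_keys[(user_id, content_id)] = key
    licenses[(user_id, content_id)] = rights      # e.g. {"play": True, "copy": False}
    return Fernet(key).encrypt(plaintext)

# Client side: the controller asks the owner's server whether the action is
# allowed and, only if it is, obtains the key and decrypts.
def open_content(user_id: str, content_id: str, ciphertext: bytes, action: str) -> bytes:
    rights = licenses.get((user_id, content_id), {})
    if not rights.get(action, False):
        raise PermissionError(f"'{action}' not permitted by the license")
    key = content_keys[(user_id, content_id)]
    return Fernet(key).decrypt(ciphertext)

if __name__ == "__main__":
    blob = package_content("alice", "song-001", b"audio bytes",
                           {"play": True, "copy": False})
    print(open_content("alice", "song-001", blob, "play"))   # decrypts
    # open_content("alice", "song-001", blob, "copy")        # raises PermissionError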
In more complex systems, keys are unique
not only for the content but for the function and
device as well. The content owner can configure
access in many ways. For example, a document
might be viewable but not printable. The content
owner may specify that the document may only
be used on a single device or only for a limited
time. By tying access rights directly to physical devices, such as CPUs, hard drives, or other
devices, content owners not only control who is
accessing the information but also protect against
illegal copying or sharing. When copyrighted
content is bound to a physical disk, the user can
loan, destroy, or give the disk away, but when
the content is copied, it would not be viewable
or playable. This is a source management approach where only a simple consumption right,
from owners to users, is allowed. The limitations
of this approach can disadvantage the user. For
example, the user may want to contribute to the
content, and the relationship has to be bidirectional
for that capability to exist. The system can easily
become useless if contents are to be edited, or
multiple sources combined to form a new value
added product (Godwin, 2006).
While most modern DRM technologies use encryption systems such as PKI to control user access to digital content, some distributors opt for a strategy of distributing digital content with embedded DRM software. In this way, copyright protection does not rely solely upon a user's good faith. Once the content is downloaded onto a user's computer, there is no further contact with the distributor. Users access the digital content under restrictions. If one attempts a function not authorized under the EULA, the embedded DRM software downloaded and installed with the content will stop the computer's operating system from executing the requested task.

On a positive note, this approach does not affect user privacy in terms of personal information gathering, and the actual use of digital content can take place in total privacy, even off-line, free of obstructions. However, apart from the obvious security risks, such an approach constitutes a more serious invasion of privacy by installing software onto a user's computer, which is personal property, without the user's full knowledge. Even if the user consented to the EULA, one could argue that such an agreement is based on unconscionable licensing terms. Distributors must be extremely careful with their embedded DRM software, especially after the Sony BMG episode of creating rootkit security vulnerabilities on millions of users' computers (EFF, 2005). Embedded DRM technology should always be subject to an independent and vigorous security review. Even with such review, the strategy may not be worthwhile given the legal liability and negative publicity that could result.
DRM with encryption is less intrusive, as it does not involve installing software on a user's computer. Encryption keys are used to set and automatically enforce limits on user behavior. Different keys can have different privileges, and the keys can even be tied to a particular device or set of devices. Under this approach, the user would submit the serial numbers of the devices where the content will be used. The encryption keys can be generated using the submitted numbers as a factor. This is a more complicated and persistent approach, and it requires ongoing and periodic contact between user and distributor: users want to be able to continue using the content when they replace an old device with a new one. While the information collected under this scenario is minimal and less sensitive, it does build up a device-ownership database from which information can be mined. Users' privacy rights can be affected, since the results of the data mining could be sold to others.
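A minimal sketch of deriving content keys from submitted device serial numbers follows, using only the Python standard library. The derivation scheme and all names are assumptions made for illustration, not a description of any particular commercial DRM system.

# Deriving a content key from a device serial number, so that the same
# encrypted file is only usable on registered devices. Illustrative only.
import hashlib

DISTRIBUTOR_SECRET = b"distributor-master-secret"   # never leaves the distributor

def device_bound_key(content_id: str, device_serial: str) -> bytes:
    """Derive a 32-byte key from the content ID and the device serial number."""
    return hashlib.pbkdf2_hmac(
        "sha256",
        device_serial.encode(),                       # secret material tied to the device
        DISTRIBUTOR_SECRET + content_id.encode(),     # salt tied to the content
        100_000,
    )

if __name__ == "__main__":
    # The distributor records registered serial numbers, so it can re-derive
    # the same key later, for example when a user replaces a device.
    key_on_player = device_bound_key("song-001", "SN-1234-ABCD")
    key_on_other = device_bound_key("song-001", "SN-0000-0000")
    print(key_on_player != key_on_other)   # True: content unusable on unregistered devices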
While DRM encryption does not raise privacy issues in itself, the functional restrictions it puts in place intrude upon the private space and the autonomy to determine the circumstances of use of one's own private property. Technologies that constrain user behavior narrow the zone of freedom traditionally enjoyed for activities in private spaces. In so doing, they decrease the level of autonomy that users enjoy with respect to the terms of use and enjoyment of intellectual goods, and they restrict freedom of expression.
DRM technologies can also restrict legitimate
fair use. Without the Fair Use doctrine, copyrights
would intrude into everyday life in innumerable
and intolerable ways (Lohmann, 2002). Under
fair use, a creative person can make some use of
another's work that one believes to be fair. If the
rights holder agrees, use continues. Otherwise,
the case can be resolved in the courts. If, however, such a fair use attempt is prevented from
the outset by DRM technologies, there will be no
experimentation, no product innovations, and no
evolution of new consumer markets.
The Internet Age and the Pay-Per-Use Model: A New Model of Content Distribution

From a legal perspective, buying digital content and acquiring a license to it have very different implications. Sales involve the complete transfer of ownership rights in the copy. Copyright law explicitly anticipates the sale of copyrighted products and, by the first sale rule, constrains a copyright holder's rights in copies of the work that have been sold. For example, if one purchases a music CD, after paying for the product one owns the product as personal property. One is regulated by property and copyright law: one can sell the CD, or lend or resell it to someone else. Licensing, however, constitutes a limited transfer of rights to use an item on stated terms and conditions. Licenses are governed by contract law and, as such, are essentially a private agreement between two parties. In the case of content acquired under licensing, apart from copyright laws, the conditions of use are specified in the license. The contractual agreement can involve a wide range of terms and conditions, and need not incorporate any public policy considerations beyond basic limits on what constitutes an enforceable contract.
A new medium is often introduced for its value in altering the access to and distribution of existing materials. However, the long-term success of the new medium lies in its ability to expand the horizon of effective communication, enabling presentation of existing materials in innovative ways that were not previously possible. Illustrating the point, when motion pictures were invented around 1890, early filmmakers saw them primarily as a means of distributing existing material, for example, stage performances. It took a long time before movies were recognized as able to extend expressive possibilities far beyond what is attainable in a stage performance. The maturity and potency of a new medium is reflected by the extent to which the market has taken advantage of its expanded horizons for communicating ideas in ways that cannot be reproduced under the older medium (Fraser, 1999).

Early digital content found on the Internet consisted merely of repurposed forms of analog products in digital format. As content distributors needed to find ways to overcome the liabilities of screen reading and to improve consumer experiences, they slowly added interactive features for user control and user-centric content. The Internet is a powerful medium for communication and has the potential to be much more than merely a new distribution channel. Content owners look beyond their own Web sites by sending content to places where it may find more uses or more audiences.
Syndication refers to the process by which a Web site shares content with other Web sites to ensure the widest possible audience and more overall traffic. For example, a restaurant reviewer can syndicate content to travel Web sites, a mutually beneficial arrangement for both the Web sites and the reviewer. The approach is similar to syndicated television shows that are produced by studios and sold to independent stations in order to reach a broader audience. Several vendors have tools for automating syndication relationships, and there is an open standard protocol for site interoperability. Taking one step further, and looking beyond content syndication, the syndication model can be expanded for content distribution to multiple and arbitrary parties. Under this model, controls over usage based on contractual agreements are mandatory, as the trustworthiness of the other parties is unknown.
Under the new pay-per-use model, authentication is required whenever and wherever the digital content is used. After downloading the content from the distributor, users may have the right to use the work on any device at any time, but one has to prove one's legitimacy each time the content is used. The amount of user tracking under this approach is much greater, and it is qualitatively more threatening to consumer privacy than under the buy-and-own model. Potentially collected information about a user includes a complete listening, reading, and viewing history. It could also include credit card and financial information collected in conjunction with a pay-per-use arrangement for access to the content. Not much different from the problem with spyware, information garnered by DRM systems in our highly networked society can be used to build a dossier of a user's preferences and usage patterns. Such information can in turn be sold to online marketers or even obtained by the government to keep a tight watch on its citizens.
Privacy control in DRM is a complicated
matter because there are legitimate reasons for
distributors to collect data about users and their
activities. For example, the collected data can be
used for traffic modeling, infrastructure planning,
quality of service (QoS), and risk management.
Data mining of aggregated, depersonalized data
can be used for trend spotting, untargeted marketing, and advertising. Yet, data collected for
legitimate reasons can also be used illegitimately,
and by parties who are unknown at the time when
the data is gathered. These parties may be any of
the following: future business partners that the

data gatherer acquires, buyers of aggregated data,


and any number of data brokers. While consumers
may consent for data to be gathered by company
X, the same data may end up being used by others
of whom the consumer will never be aware (Estes,
2007). In today's dynamic business environment,
the mergers, acquisitions, bankruptcies, and other
changes over the life cycle of a corporation can
radically change who has access to what information and how that information may be used and
cross referenced.
The conflict between information gathering
and consumer privacy is not new to e-commerce.
If a company is not able to gather information
about its customers, the company will not be able
to serve its market efficiently, potentially leading
to higher prices. If data collection is inevitable,
but a company has an established privacy policy
and actually adheres to the policy, customers will
also be more comfortable doing business with the
company. The Federal Trade Commission (FTC)
in 2000 issued a report identifying five principles
of fair information practices: notice, choice, access, security, and enforcement. The notice principle entails informing consumers about the collection, use, and disclosure of information. It
recommended that businesses give a clear and
conspicuous notice of their information practices.
If companies adhere to these principles, they
will not be subject to sanctions (FTC, 2000). It
is not clear, however, whether users have a good
understanding of their rights with respect to data
about themselves. A campaign to publicize these
rights and inform users may be necessary in order
to allow users to make informed decisions.

A POSSIBLE SOLUTION: P3P AND ONLINE PRIVACY
The World Wide Web Consortium (W3C) recently approved a technology called the Platform for Privacy Preferences (P3P) as a standard that would help consumers choose how much of their personal information to divulge online. It allows Internet users to choose the type of information they will give to Web sites and whether the Web sites can pass that information on to a third party. With the P3P specification, a site can prepare computer-readable policy statements, using Extensible Markup Language (XML) files, to describe how the organization intends to use personal data during and after a user's session. Consumers receive alerts if they go to a site whose P3P policy does not match their preferences. The P3P specification is capable of handling the privacy notification requirement within a DRM system: the browser downloads the list of policies from the provider's Web site along with the digital content, and then performs logical inferences to determine whether the policies comply with the user's privacy settings. Unfortunately, such a scenario does require a certain technological sophistication on the consumer's part in setting the correct privacy preferences in the browser. P3P has already been incorporated into many software products, including Microsoft's Internet Explorer 6, but the capabilities implemented so far are quite rudimentary. They are limited to the automated processing of cookies and the display of summary privacy policies upon request from users.
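The logical inference step mentioned above essentially compares the practices a site declares against the practices a user will accept. The sketch below illustrates the comparison in Python with a deliberately simplified policy representation (plain sets of data categories, purposes, and recipients); it is not the real P3P vocabulary or an actual user agent, and all names are invented for the example.

# Simplified illustration of matching a site's declared privacy policy
# against a user's preferences. Real P3P policies are XML documents with a
# richer vocabulary; this sketch reduces each policy to three sets.

SITE_POLICY = {
    "data":       {"clickstream", "purchase-history", "email"},
    "purposes":   {"order-fulfilment", "telemarketing"},
    "recipients": {"ourselves", "unrelated-third-parties"},
}

USER_PREFERENCES = {
    "data":       {"clickstream", "email"},           # willing to reveal
    "purposes":   {"order-fulfilment"},                # acceptable uses
    "recipients": {"ourselves", "delivery-partners"},  # acceptable recipients
}

def policy_conflicts(site: dict, prefs: dict) -> list:
    """Return a human-readable list of declared practices the user has not accepted."""
    conflicts = []
    for category in ("data", "purposes", "recipients"):
        for item in sorted(site[category] - prefs[category]):
            conflicts.append(f"{category}: {item}")
    return conflicts

if __name__ == "__main__":
    problems = policy_conflicts(SITE_POLICY, USER_PREFERENCES)
    if problems:
        print("Alert: site policy exceeds your preferences:")
        for p in problems:
            print("  -", p)
    else:
        print("Policy is within your preferences.")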

Deploying P3P
P3P implementation on a Web site involves both business and technical issues, and it requires a wide range of knowledge and input from legal, technical, and marketing perspectives. While it can be done by a single individual in the case of a small organization, it is more appropriately conducted by a team whose members have diverse backgrounds and resources. The first team assignment is to review the current information practices of the organization, followed by an in-depth discussion of long-term business plans for the use of consumer information. The entire organization should be involved in deciding exactly what the privacy policy should be, not just a top-down order of implementation. The review should examine the amount and types of information that are collected, the places where information is collected, the information handling and processing practices, and the level of sharing and integration with third parties. The outcome of the review process will be an accurate, human-readable privacy policy that all parts of the organization that interact with consumer information can agree upon.
Although most Web sites currently post human-readable privacy policies, the level of detail reflected in those policies, and adherence to them, vary widely. Gaining an accurate and detailed understanding of the consumer information flowing into and out of the organization is crucial for P3P implementations. After developing the organization's privacy policy, the P3P team should conduct a thorough policy audit. The audit should examine what information is collected, where it goes once it is collected, what it is used for, who uses it, and, most importantly, whether consumers are clearly notified of and agree to the stated information practices. In most organizations, the audit is likely to uncover inadequacies in the policies and discrepancies between the published policy and actual practice. Any problem or question must be addressed and resolved immediately. If the information policy is incomplete, or the organization does not comply with its published policy, or the policy does not accurately and fully disclose all the information to consumers, there could be serious legal ramifications (W3C, 2002a).
As the focus here is not on P3P syntax, the remaining discussion concentrates on P3P deployment. For requirement and implementation details, developers can consult the P3P specification from the W3C P3P Working Group (W3C, 2002b). Most likely, developers will use a P3P editor to create the P3P files and a P3P validator to verify the file contents. For proper deployment, the site developer needs to create three separate files: an XHTML description of privacy policies (policy.html), a P3P reference file (p3p.xml), and a P3P policy file (policy.xml). These files must be written in correct syntax in accordance with the P3P specification; in that regard, a P3P editor will be tremendously helpful in constructing the policy syntax and descriptions correctly. After the files are created, they should be stored in a directory named /w3c off the site's domain root directory. In addition, to help the browser locate the P3P files, the link tag <link rel="P3Pv1" href="/w3c/p3p.xml"></link> should be added to every document on the Web site. When a P3P-enabled browser visits a site with a defined P3P policy, it automatically locates, reads, and compares the policy with the privacy preferences set in the browser. The browser then displays an alert message or blocks transmission of certain data if the site's policy differs from the consumer's preferences.
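Assuming a site has been set up as just described, a developer could sanity-check the deployment with a short script along the following lines. It uses only the Python standard library, the site URL is a placeholder, and it merely checks that /w3c/p3p.xml is served as well-formed XML and that a page advertises it with the expected link element; it is no substitute for validating the policy against the P3P specification.

# Rough deployment check for a P3P setup: is the reference file reachable and
# well-formed XML, and does the page carry the expected <link rel="P3Pv1" ...> tag?
import urllib.request
import xml.etree.ElementTree as ET

SITE = "http://www.example.com"          # placeholder site

def fetch(url: str) -> str:
    with urllib.request.urlopen(url, timeout=10) as response:
        return response.read().decode("utf-8", errors="replace")

def check_reference_file() -> bool:
    try:
        ET.fromstring(fetch(SITE + "/w3c/p3p.xml"))   # raises if not well formed
        return True
    except Exception as exc:
        print("p3p.xml problem:", exc)
        return False

def check_link_tag(page_path: str = "/") -> bool:
    html = fetch(SITE + page_path).lower()
    return 'rel="p3pv1"' in html and "/w3c/p3p.xml" in html

if __name__ == "__main__":
    print("reference file OK:", check_reference_file())
    print("link tag present :", check_link_tag())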

P3P and Privacy Policy

While P3P provides a way for sites to make their privacy policies available in an automated and structured manner, it does not and cannot enforce privacy practices. However, consumers should be aware of a site's stated policy, as they have been warned of what information will be collected and how that information will be used. Additionally, a site's policy is legally binding: the site's operator is liable for violation of any stated policies. If an organization states that it is going to do one thing and does something else, there is at present no technological process to prevent that. However, deception and illegality are more appropriately resolved in the arena of public policy, legislation, and the courts. While P3P is useful for informing consumers about an organization's privacy policies, it does not directly improve privacy protections. Nonetheless, P3P implementation does increase transparency, which is a positive step forward for both businesses and consumers (Mulligan & Schwartz, 2000).

FUTURE TRENDS

DRM systems can have many variations in implementation details, but they generally conform to a common structure (Rosenblatt, Trippe, & Mooney, 2001). Consumers download the digital content from a Web site. The downloaded package contains encrypted content and a key, along with a license that stipulates the usage rights. A software or hardware controller interprets the licensing agreement. If authentication and authorization are successful, the controller allows the consumer to perform what is intended, such as playing, viewing, or copying the digital content. However, within this general framework, there are several important implementation-specific issues that must be addressed (a minimal sketch of the overall flow follows the list below). These issues are:

a. How does the system authenticate users? The options are to use passwords, smart tokens, another server, or the device itself.
b. Are the controllers open source or proprietary? There are advantages and disadvantages to both approaches.
c. How is the licensing bundled: buy-and-own, pay-per-use, or per site?
d. How much fine-grained control does the content owner have?
e. Are consumers required to be connected to the network when using the content?
f. What consumer information will be required and collected, and how will the information be used?
g. How do royalties and payments integrate with the DRM system?
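A minimal sketch of the common structure outlined above, with some of the listed choices hard-coded for illustration (password authentication, a software controller, no network connection required at playback time), is given below; all class and field names are invented for this example, and the "decryption" is a trivial stand-in.

# Minimal sketch of the generic DRM control flow: a downloaded package holds
# encrypted content plus a license; a controller authenticates the user,
# interprets the license, and only then releases the requested action.
import hashlib
from dataclasses import dataclass, field

@dataclass
class License:
    user_id: str
    password_hash: str                               # choice (a): password authentication
    permissions: set = field(default_factory=set)    # e.g. {"play", "view"}

@dataclass
class Package:
    encrypted_content: bytes
    key: bytes
    license: License

class Controller:
    """Software controller that interprets the license (choice (b): software)."""

    def authenticate(self, lic: License, user_id: str, password: str) -> bool:
        digest = hashlib.sha256(password.encode()).hexdigest()
        return user_id == lic.user_id and digest == lic.password_hash

    def perform(self, pkg: Package, user_id: str, password: str, action: str) -> bytes:
        if not self.authenticate(pkg.license, user_id, password):
            raise PermissionError("authentication failed")
        if action not in pkg.license.permissions:
            raise PermissionError(f"license does not allow '{action}'")
        # Stand-in for real decryption with pkg.key:
        return bytes(b ^ pkg.key[0] for b in pkg.encrypted_content)

if __name__ == "__main__":
    key = b"\x2a"
    lic = License("alice", hashlib.sha256(b"secret").hexdigest(), {"play"})
    pkg = Package(bytes(b ^ key[0] for b in b"song data"), key, lic)
    print(Controller().perform(pkg, "alice", "secret", "play"))   # b'song data'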

In reaction to the advent of the Internet and e-commerce, organizations have been experimenting with DRM technologies to implement new business models for digital content, as this is the kind of information that is most suitable for online distribution. However, there has been only modest success, and the difficulty of getting consumers comfortable with new ways of consuming digital content is a major reason. Ingrained in consumers' attitudes is a strong feeling that people should be allowed to do what they wish with the property they pay for and own, without restriction and without fear of being controlled or monitored. Thus, it is important that DRM technologies support such usage expectations. At a minimum, consumers should be able to use the digital content in any format on any device they own, and this should include fair use rights such as copying for research or personal non-commercial purposes. The technology must also be lightweight, seamless, and user friendly. The system should be deployable on mass-market legacy platforms such as PCs, preferably without adding new hardware or tedious software installation on the consumer's part. It should provide unbreakable security without adverse effects on the consumer's platform. Finally, the system should support multiple content types such as audio, video, text, and software, as well as multiple platforms and DRM vendors, and allow content migration between them. Furthermore, all operations should be as transparent as possible.
Technology developers and content providers hoping to profit from the mass e-commerce market must agree on a common framework of interoperability requirements so that key technology standards and compatibility issues can be resolved. The multiple, incompatible DRM standards that exist right now will only impede the development of the digital content market. Broad adoption of media-sharing devices will be delayed as long as content owners disagree among themselves on how they wish to benefit from DRM technologies. Technology providers, in turn, cannot develop a horizontal market for connected devices until major content providers have agreed on a common framework of DRM interoperability.

CONCLUSION
Information goods are characterized by negligible marginal costs, and therefore arguments in favor of subscription or pay-per-use are stronger for them than for physical goods. DRM technologies are an indispensable part of the pay-per-use business model. While media companies perennially lobby for pro-DRM legislation, government mandates must be fashioned with the greatest care because they limit the ability of innovators to introduce new technologies as well as the ability of consumers to purchase goods on the free market. Furthermore, mandating a technically unwise standard would not be in anyone's best interest.
DRM technologies can be a threat to consumer
privacy through the disclosure of consumer data
or the erosion of fair-use rights. Although these
threats are real, DRM technologies can have a
positive impact on consumer privacy. Electronic
transactions are potentially more conducive to
user privacy than older paper distribution channels. DRM systems can be designed to support
anonymous e-payment tokens, allowing usage
tracking without user tracking. Through DRM
technologies, consumers could be enabled to
assign copyright control for their personal data
as owners of that information. Such upstream
copyright licensing would empower consumers
to prevent entities from misusing or reselling
their information.

FUTURE RESEARCH DIRECTIONS


E-commerce of digital content is a rapidly
expanding market due to an abundant supply of
digital content and the growing practicality of
conducting transactions over the Internet. Delivering content directly to consumers online not only eliminates physical costs such as paper, printing, and storage, but also allows for highly customizable products and opens up numerous business opportunities. There are many types of DRM systems, and each supports a different business model. Some
interesting topics for exploration include:

1. What are the benefits and challenges of different systems and business models?
2. What are the current trends in the digital content distribution and delivery market?

E-learning technology is rapidly changing the landscape for education products and services.
Publishers, the traditional undisputed leaders in
the educational content market who based their
business process on the production of textbooks,
must now seriously rethink their role and business
model. The digital educational content market
has arrived. The focus has now shifted from the distribution and sale of tangible products to the distribution and licensing of intangible products, from products to services, and perhaps the phasing out of the paper product altogether. While e-books
have not experienced mass-market success, digital
technology does provide ample opportunity for
content marketers if one can find the right strategy
and develop an effective business model.
For education purposes, digital content availability and usefulness may seem to be incompatible with DRM, as its restrictions undermine
the educational mission of promoting access,
scholarship, and learning. Academics, teachers,
and learners will expect to be able to reuse and
repurpose digital content directly, to be able to
edit digital content, and to be able to easily combine digital content. Historically, DRM has been
focused on security and enforcement. A DRM
system for an educational institute must also satisfy regulatory requirements, most prominently,
the Family Educational Rights and Privacy Act
(FERPA), the Health Insurance Portability and
Accountability Act (HIPAA), and the Gramm-Leach-Bliley (GLB) Act. Furthermore, apart from
fair-use, copyright laws governing an educational
institute also include library exceptions and the
TEACH Act. Given all these, a useful DRM system
for educational content will indeed be a complex
and challenging undertaking.

REFERENCES
American Heritage. (2006). American Heritage
Dictionary of the English Language (4th ed.).
Retrieved April 16, 2007, from http://dictionary.
reference.com/browse/copyright
Chor, B., Fiat, A., Naor, M., & Pinkas, B. (2000).
Tracing traitors. IEEE Transactions on Information Theory, 44(3), 893-910.
ClickZ. (2003). Users still resistant to paid content.
Jupitermedia, April 11. Retrieved April 16, 2007,
from http://www.ecommerce-guide.com/news/
news/print.php/2189551
Cohen, J. (2003). DRM and privacy. Communications of the ACM, 46(4), 47-49.
Cox, I., Miller, M., & Bloom, J. (2002). Digital
watermarking: Principles & practice. San Diego,
CA: Academic Press.
Cravens, A. (2004). Speeding ticket: The U.S.
residential broadband market by segment and
technology. In-Stat/MDR.
EFF. (2005). Sony BMG Litigation Info, Electronic
Frontier Foundation. Retrieved April 16, 2007,
from http://www.eff.org/IP/DRM/Sony-BMG/
EMI. (2007, April 2). EMI music launches DRMfree superior sound quality downloads across
its entire digital repertoire. EMI Group, Press
Release.
EPIC. (2002, June 5). Letter to house judiciary
subcommittee on the courts, the internet, and intellectual property. The Electronic Privacy Information Center and the Electronic Frontier Foundation.
Retrieved April 16, 2007, from http://www.epic.
org/privacy/drm/hjdrmltr6.5.02.html
Estes, A. (2007, May 6). Bill would ban lenders
alerts to homebuyers. Boston Globe, p. B3.
Fraser, A. (1999, August 8). Chronicle of Higher
Education, 48, p. B8.



FTC. (2000). Privacy online: Fair information practices in the electronic marketplace. Washington, DC: Federal Trade Commission.
Godwin, M. (2006). Digital rights management: A
guide for librarians. Office for Information Technology Policy, American Library Association.
Kellan, A. (2001). Whiz kid has fingerprints all
over New Napster. CNN Technology News, July
5. Retrieved April 16, 2007, from http://edition.
cnn.com/2001/TECH/internet/07/05/napster.
fingerprint/index.html
LII. (2007a). United States Constitution Article
I Section 8, Legal Information Institute, Cornell
Law School.
LII. (2007b). US Code Collection, Title 17, Copyrights, Legal Information Institute, Cornell Law
School.
LII. (2007c). The Bill Of Rights: U.S. Constitution, Amendment IV, Legal Information Institute,
Cornell Law School.
LII. (2007d). U.S. Code Collection, Title 18, S
2510, The Electronic Communications Privacy
Act of 1986, Legal Information Institute, Cornell
Law School.
Lohmann, F. (2002). Fair Use and Digital rights
management, Computers, Freedom & Privacy,
Electronic Frontier Foundation.
Mulligan, D., & Schwartz, A. (2000). P3P and
privacy: An update for the privacy community.
Center for Democracy & Technology. Retrieved
April 16, 2007, from http://www.cdt.org/privacy/
pet/p3pprivacy.shtml

Rosenblatt, B., Trippe, B., & Mooney, S. (2002). Digital rights management: Business and technology. New York: M&T Books.

Russ, A. (2001). Digital rights management overview. SANS Information Security Reading Room, July 26. Retrieved April 16, 2007, from http://www.sans.org/reading_room/whitepapers/basics/434.php
UCITA. (2002). Uniform Computer Information
Transactions Act. National Conference of Commissioners on Uniform State Laws.
USG. (1998). The Digital Millennium Copyright
Act of 1998, U.S. Copyright Office, Pub. 05-304,
112 Stat. 2860.
W3C. (2002a). How to create and publish your
companys P3P policy in 6 easy steps. W3C P3P
Working Group. Retrieved April 16, 2007, from
http://www.w3.org/P3P/details.html
W3C (2002b). The platform for privacy preferences 1.0 (P3P1.0) specification. W3C P3P
Working Group. Retrieved April 16, 2007, from
http://www.w3.org/TR/P3P/
Wikipedia (2007). Retrieved October 1, 2007, from
http://en.wikipedia.org/wiki/Public_key_infrastructure

ADDITIONAL READING
Ahrens, F. (2005). Hard news, daily papers face
unprecedented competition. Washington Post
Sunday, February 20, p. F01
BEUC. (2007). Consumers digital rights initiative,
European consumers organization. Retrieved
June 22, 2007, from http://www.consumersdigitalrights.org
Boiko, B. (2002). Content management bible. New
York: Wiley Publishing Inc.

Rosenblatt, B., Trippe, B., & Mooney, S. (2002).


Digital rights management: Business and technology. New York: M&T Books.

Brands, S. (2000). Rethinking public key infrastructures and digital certificates; building in
privacy. Cambridge, MA: The MIT Press.

Russ, A. (2001). Digital rights management


overview. SANS Information Security Reading
Room, July 26. Retrieved April 16, 2007, from

Collier, G., Piccariello, H., & Robson, R. (2004).


Digital rights management: An ecosystem model
and scenarios for higher education. Educause
Center.



Duncan, C., Barker, E., Douglas, P., Morrey, M., & Waelde, C. (2004). JISC DRM study. Intrallect Ltd. Retrieved June 22, 2007, from http://www.intrallect.com/drm-study/DRMFinalReport.pdf
EFF. (2005). Dangerous terms, a user's guide to EULA. Electronic Frontier Foundation. Retrieved June 22, 2007, from http://www.eff.org/wp/eula.php

Hasebrook, J. (2002). International e-learning business: Strategies & opportunities. In Proceedings of the World Conference on E-Learning in Corp., Govt., Health & Higher Ed. (Vol. 1, pp. 404-411).
LII. (2007). U.S. Code 20.31.3.4.1232g, FERPA.
Legal Information Institute, Cornell Law
School.

EFF. (2007). A user's guide to DRM in online music. Electronic Frontier Foundation. Retrieved June 22, 2007, from http://www.eff.org/IP/DRM/guide/

Mazzucchi, P. (2005). Business models and rights management for e-learning in Europe. INDICARE Monitor, 2(7), 25-29.

EPIC. (2000). Pretty poor privacy: An assessment of P3P and Internet privacy. Electronic Privacy Information Center. Retrieved June 22, 2007, from http://www.epic.org/reports/prettypoorprivacy.html

Morris, R., & Thompson, K. (1979). Password


security: A case history. Communication of the
ACM, 22(11).

FTC. (2007). Public Law 106-102, GLB. Federal Trade Commission. Retrieved June 22, 2007, from http://www.ftc.gov/bcp/conline/pubs/buspubs/glbshort.htm
Green, K. (2004). The 2004 national survey of
information technology in U.S. higher education,
the campus computing project. Retrieved June 22,
2007, from http://www.campuscomputing.net/
Guibault, L., & Helberger, N. (2005). Copyright
law and consumer protection. European Consumer Law Group. ECLG/035/05
Gunter, C., Weeks, S., & Wright, A. (2001). Models
and languages for digital rights. Technical Report
STAR-TR-01-04, InterTrust STAR Lab.
HHS. (2007). Public Law 104-191, HIPAA. Department of Health & Human Services. Retrieved
June 22, 2007, from http://aspe.hhs.gov/admnsimp/pl104191.htm

National Research Council Panel on Intellectual Property. (2000). The digital dilemma: Intellectual property in the information age. Washington, D.C.: National Academy Press.
Prince, B. (2007). IBM shields online personal
data. eWeek News & Analysis. Retrieved June
22, 2007, from http://www.eweek.com/article2/0,1759,2095275,00.asp
Schaub, M. (2006). A breakdown of consumer
protection law in the light of digital products.
Indicare Monitor, 2(5).
Schwartz, J. (2004). In survey, fewer are sharing
files (or admitting it). The New York Times, Jan.
5, Section C, p. 1.
Shapiro, C., & Varian, H. (1999). Information
rules: A strategic guide to the network economy.
Boston, MA: Harvard Business School Press.
UDDI. (2007). FAQs for UDDI Initiative and
Standard, OASIS Standards Consortium. Retrieved June 22, 2007, from http://www.uddi.
org/faqs.html





Chapter XII

Online Privacy and Marketing: Current Issues for Consumers and Marketers
Betty J. Parker
Western Michigan University, USA

ABSTRACT
Marketing practices have always presented challenges for consumers seeking to protect their privacy.
This chapter discusses the ways in which the Internet as a marketing medium introduces additional
privacy concerns. Current privacy issues include the use of spyware and cookies, word-of-mouth marketing, online marketing to children, and the use of social networks. Related privacy practices, concerns,
and recommendations are presented from the perspectives of Internet users, marketers, and government
agencies. The chapter concludes with a discussion of the ways in which consumers' privacy concerns, as they apply to Internet marketing, would benefit from additional research.

INTRODUCTION
Privacy has once again become the price computer users pay for their use of the information technology infrastructure. (Mathias Klang, 2004)
The Internet is a marketer's dream come true.
No other medium comes close to providing the

two-way communication, vast global reach, cost


effectiveness, and tracking capabilities of the
Internet. The Internet may well be on its way to
becoming the medium of choice for advertising
and promotion of both consumer and B2B (business to business) products and services. But all
media have advantages and disadvantages, and despite the efficiencies and effectiveness of Internet marketing, privacy and security concerns continue to dominate use of the Internet for marketing
purposes (Hoffman, Novak, & Peralta, 1999).
The marketing industry has long raised many
issues for privacy advocates, even before the
emergence of the Internet as a marketing medium
(Ashworth & Free, 2006; Culnan, 1995; Jones,
1991). Today, the ease with which marketers can
track users' preferences and behaviors to serve
them personalized advertisements represents a
brave new world of privacy issues. The magnitude
of data collection on the Internet is enormous and
the FTC has estimated that up to 92% of Web sites
collect personal information (FTC, 1998). Privacy
issues as they apply to the marketing arena are
especially challenging: What could be more personal and potentially damaging to consumers than
the unauthorized sharing of credit or debit card
numbers or public knowledge about one's medical
information or purchases, for example?
The focus of this chapter is Internet privacy in
the context of consumer marketing. The chapter
will provide insights into the ways that online privacy has become a balancing act in which the needs
of businesses are oftentimes balanced against the
needs of consumers. A number of privacy issues
that affect the marketing of products and services
will be presented, along with recommended best
practices. The marketing/privacy issues to be discussed in this chapter are: (1) consumer, marketer,
and government perspectives on data collection,
ownership, and dissemination; (2) online advertising and the use of cookies and spyware; (3)
word-of-mouth marketing and the use of blogs,
sponsored chat, and bulletin boards; (4) marketing
online to children; and (5) privacy issues in social
networks and online communities.
There is currently a gap in the literature
regarding specific online marketing techniques
and privacy issues that impact marketers and
consumers alike. Much of the marketing literature to date has focused on regulations, public
policy, and consumer attitudes toward privacy.
This chapter represents one of the first analyses

of online marketing practices and their associated


privacy issues. Managerial issues and suggested
marketing best practices are also provided.

THREE PERSPECTIVES ON ONLINE PRIVACY, DATA COLLECTION, AND DATA OWNERSHIP
There is consensus about the importance of online
privacy among Internet users, marketers, and government agencies. Online privacy was a concern of
81% of Internet users and 79% of consumers who
buy products on the Internet, according to a tracking study by the U.S. Department of Commerce
(Oberndorf, 1998). Marketers' privacy concerns
are reflected in the fact that approximately 77%
of Fortune 500 companies posted a privacy statement on their Web site (Schwaig, Kane, & Storey,
2005). In addition, government agencies play an
important role in the regulation of the medium,
and lawmakers continue to pass laws to increase
online protection. Background information about
key U.S. laws regulating privacy in marketing can
be found in Table 1.
Nevertheless, consumers, marketers, and the
government can present different, and sometimes
competing, perspectives on Internet privacy
issues such as data collection, ownership, and
dissemination.

Table 1. Online privacy legislation affecting marketing on the Internet

Act | Year | Description
Privacy Act | 1974 | Forbids government from gathering or maintaining secret information about people without a lawful purpose.
Communications Decency Act | 1996 | Regulates indecency and obscenity on the Internet.
Children's Online Privacy Protection Act | 1998 | Provides rules and guidelines for online information collected from children.
CAN-SPAM Act | 2003 | Establishes standards for commercial email use.

For example, the consumer hopes to provide little or no information, the marketer hopes to collect as much consumer information
as possible, and the government seeks to protect
the interests of both while maintaining an open
and fair environment.
Historically, marketing activity has been
viewed as an exchange (Bagozzi, 1975). On the Internet, the user provides personal data in exchange
for what he or she considers useful information or
access to information. But research has revealed
that consumers anticipate risks when providing
personal data. Wang, Lee, and Wang (1998) documented consumers' attitudes and privacy concerns regarding Internet marketing activities, including consumers' inability to opt out of data collection and third-party distribution of information. An
international survey of consumers revealed that
nearly 20% of Web users felt that magazines have
the right to sell subscribers' data to other firms,
but only 12% felt that online marketers had the
same right (Hoffman et al., 1999).
Because the Internet represents two-way communication between the consumer and marketer,
consumers have a unique and easy opportunity to
provide positive or negative feedback to marketers related to privacy issues. One study reported
that as consumers' privacy concerns increased,
they were more likely to react in the following
ways: provide incomplete information, notify ISPs
about spam, request removal from mailing lists,
and send a flame to spammers. They were also
less likely to register on the Web site (Sheehan
& Hoy, 1999).
Marketers seek and utilize demographic and
behavioral information from consumers for a
variety of reasons, including targeting purposes.
Ownership of valuable consumer data can be one
of the most important assets of a company and
the collection and reselling of marketing data can
be a profit center for companies. The American
Marketing Association's Internet Marketing Code
of Ethics states that information collected from
customers should be confidential and used only for expressed purposes. However, as is also true of off-line marketing practices, the ideal is
not always the practice. One reason, suggested
by Caudill and Murphy (2000), is that traditional marketing activities usually require the
consumer's knowledgeable participation, while
online information collection (and marketing)
does not. Radin, Calkins, and Predmore (2007)
attributed the problem to some firms that do not
treat collected information respectfully and
responsibly.
A good example of ethical concerns in data
collection, ownership and dissemination can be
found by examining consumer data practices of
ETour.com, an Internet marketer of personalized
virtual Web site tours. Looking to raise capital,
ETour, which had a privacy policy against the
provision of customer data to third parties, sold
registration information from 4.5 million users
and 2.2 million e-mail subscribers to the search
engine Ask Jeeves in 2001, following the dotcom crash. EPIC, a public advocacy group, filed
a complaint with the Federal Trade Commission
(FTC) alleging unfair and deceptive trade practices. EPIC claimed that a precedent had been set
one year earlier when Toysmart.com, an online toy
retailer, was blocked by the FTC from auctioning
its customer lists to pay creditors (Krebs, 2001;
Montana, 2001).
The Federal Trade Commission plays an important role in government policy creation and enforcement. The FTC has identified several norms
(fair information practices principles) governing
ethical use of consumer information:
1. Notice: The site will provide notice.
2. Consent: Users should be allowed to choose to participate or not.
3. Access: Users should have the ability to access their data.
4. Security: Policies should be in place that will ensure the integrity of data and the prevention of misuse.
5. Enforcement: Users should have means to challenge data collectors who are not following their policies.

Marketers have traditionally preferred self-regulation to government regulation. Self-regulation efforts by business, such as the Network
Advertising Initiative (NAI), represent industry's
efforts to protect privacy without the aid of the
federal government. The NAI, which was formed
in 1999 in response to the controversial use by
DoubleClick of click stream data combined with
personal information, represented a historical
moment for privacy protection in marketing. Two
other industry groups, the U.S. Better Business
Bureau and TRUSTe, now provide downloadable
templates for businesses to provide accurate
privacy policies quickly and easily to corporate
web sites.
Companies have also taken a stand for self-regulation. In 1999 IBM announced that it would
cancel its Internet advertisements from any Web
site that did not provide clear privacy policies
(Auerbach, 1999). Nevertheless, an effective and
comprehensive self-regulatory policy for online
consumer privacy has not yet emerged (Culnan,
2000).
There are a number of recommended best
practices to be applied to the issue of data collection, ownership, and dissemination. In addition
to providing clearly labeled privacy policies and
prominent data collection warnings, marketers
will find it beneficial to practice good public and
customer relationships in order to generate online
registrations and information from consumers.
Building relationships with consumers before
making unsolicited contact or asking them to
register or provide information has been recommended as an important strategy for marketers
(Milne, 1997; Sheehan & Hoy, 1999). There is also
a highly positive relationship between a company's
positive reputation and consumers' decisions to
provide accurate personal information to the Web
site (Xie, Teo, & Wan, 2006).

Internet Advertising Practices: The Use of Cookies and Spyware
The advertising model, in which free or low-cost media content is exchanged for advertising
placements, has been a standard business model
in the off-line world. It appears that the Internet
has embraced a similar advertising-for-content
model, although there are examples of subscription-based content providers (The Wall Street
Journal, for example) and other models of online
exchange. Unlike traditional media such as television, magazines, or newspapers, the Internet can
deliver metrics such as advertising impressions,
click-through rates, and page views on a daily or
hourly basis. Despite the advantages that Internet
advertising provides, this new medium has also
created a new list of privacy issues for consumers
and marketers alike.
The most basic and vital tools in the marketer's
toolbox are segmentation of the market and targeting of users, and the Internet helps marketers to
segment and target with great accuracy. Marketers are rapidly shifting mass media dollars to the
Internet, which has grown from 0.7% of total ad
dollars spent in 1998 to an estimated 3% in 2006,
which translates to $8.7 billion (Media Week
Marketer's Guide to Media, 2006).
Behavioral targeting/marketing is online advertising that serves ads to individuals who are
most likely interested in them based on previous
online activity (Shimp, 2007). Behavioral targeting can be an improvement on traditional segmentation and targeting because the marketer has a
strong indication that the individual being targeted
has exhibited a behavior that indicates potential
interest in the product. Behavioral marketing
takes as its premise that past behavior, such as
site visits or page views, is the best predictor
of future behavior. The information technology
forming the basis of behavioral marketing is the
cookie.


Cookies, small packets of data placed on a
user's hard drive through instructions generated
from a Web page visited by the user, continue to
occupy an important place in discussions of online
marketing and privacy. Cookies are ubiquitous
applications able to function without a user's
knowledge or control (Strauss, El-Ansary, & Frost,
2006). Online marketers cite many advantages
to their use: the site owner can measure visitor
traffic, the advertiser can better target their ads,
and the site or e-mail can be tailored to the visitor's preferences, and so forth (Dobosz, Green,
& Sisler, 2006).
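To make the mechanism concrete, the short Python sketch below is purely illustrative; the visitor identifier and lifetime shown are hypothetical and not drawn from any of the studies cited here. It shows how a server composes the Set-Cookie header that places a small identifier on the visitor's machine and how it reads that identifier back on a later request:

    # Illustrative only: how a Set-Cookie header is built and later parsed.
    from http.cookies import SimpleCookie

    # First visit: assign an anonymous visitor ID and describe its lifetime.
    outgoing = SimpleCookie()
    outgoing["visitor_id"] = "a1b2c3d4"                      # hypothetical identifier
    outgoing["visitor_id"]["max-age"] = 60 * 60 * 24 * 365   # keep for roughly a year
    outgoing["visitor_id"]["path"] = "/"
    print(outgoing.output())   # Set-Cookie: visitor_id=a1b2c3d4; Max-Age=...; Path=/

    # Later visit: the browser sends the cookie back; the site recognizes
    # the (still anonymous) visitor without a new log-in or registration.
    returning = SimpleCookie("visitor_id=a1b2c3d4")
    print(returning["visitor_id"].value)                     # a1b2c3d4

The same mechanism that spares the user repeated log-ins is what lets a site or advertiser recognize the browser across visits, which is where the privacy debate begins.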
Cookies have enabled marketers to take unique
actions online. For example, cookies have enabled
marketers to utilize dynamic pricing techniques,
defined as offering different prices to different online consumers (Strauss et al., 2006). Even though
dynamic pricing has been used (and defended)
by marketers such as airline carriers for decades,
the practice has tended to raise eyebrows among
online consumers and watchdog groups. When
Amazon was discovered to have charged customers different prices for the same CDs, the company
was accused of price discrimination based on
demographics. Amazon defended the practice as
part of a randomized test (Daub, 2001).
The utilization of cookies by marketers
has many positive aspects for consumers. For
example, cookies allow users to return to sites
without having to log in or register each time.
Most consumers are either unaware of cookies
or unconcerned by their placement, because their
identity remains anonymous to the marketer.
The DoubleClick controversy of 2000, in which
the interactive advertising agency DoubleClick
combined demographic database information with
online activity, illustrated that marketers have the
capability, if not the intention, to do more than
just monitor anonymous activity with cookies
(Strauss et al., 2006). Many users attribute data
mining capabilities to cookies that marketers say
are impossible: provision of actual names and
addresses, e-mailing capabilities, and so forth.
More likely, they are referring to spyware (also
called adware).
Spyware technology is similar to that of cookies, but it has a different, sometimes malevolent,
purpose. The spyware program creates a unique
user identifier and collects user data that can
be sent to the collector of the information; it
can be installed unintentionally when the user
installs another piece of software (Klang, 2004).
Spyware can generate pop-up ads, send spam to an
e-mail inbox, and crash a computer (Strauss et al.,
2006). Technologies such as spyware, Web bugs,
transactional database software, and chat-room
analysis tools can monitor e-mail and keystrokes
in order to serve ads to consumers even when they
are not online (Turow, 2003).
The use of cookies and adware by marketers
may continue to serve as the primary technology
tool enabling behavioral marketing. It may also
be a first-generation marketing phenomenon that
will not last. Because of the potential for (and
numerous cases of) privacy intrusion, it is possible that legislation to limit the use of cookies
and spyware will be enacted. To some degree,
consumers have been addressing the issue themselves, because cookie rejection rates continue to
increase. A survey by Jupiter Research reported
that 58% of Web users have deleted cookies, with
39% of respondents reporting monthly deletions
of cookies (Savvas, 2005).
As Chen and Rea (2004) point out, there is a
tension between technologies that take information from consumers (cookies, for example) and
those that allow consumers the opportunity to
share information or not (P3P technologies, for
example). Because opt-in and opt-out choices
are often buried in privacy notices that many
consumers do not read, consumers do not always
realize the privacy implications of their online
activity. An important step in the protection of
online privacy is the inclusion of prominent, easy
to read and understand opt-in and opt-out choices
for the consumer.
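As a sketch of what a prominent, honored choice can mean in practice, the following hypothetical Python fragment (the cookie names and ad labels are invented for illustration and do not come from any vendor cited here) checks the consumer's stated opt-out preference before any behaviorally targeted ad is selected:

    # Hypothetical sketch: an explicit opt-out overrides behavioral targeting.
    def choose_ad(cookies: dict) -> str:
        if cookies.get("ad_opt_out") == "1":      # the consumer's stated choice
            return "generic_ad"                   # no profile-based targeting
        profile = cookies.get("interest_profile")
        return f"targeted_ad:{profile}" if profile else "generic_ad"

    # A visitor who opted out sees the generic ad even though a profile exists.
    print(choose_ad({"ad_opt_out": "1", "interest_profile": "travel"}))  # generic_ad
    print(choose_ad({"interest_profile": "travel"}))                     # targeted_ad:travel

The design point is simply that the preference check happens before, not after, any tracking data is used.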


Privacy and Word-of-Mouth Marketing: Blogs, Sponsored Chat Rooms, and Bulletin Boards
Marketing often consists of push techniques,
where sales or marketing information is sent to
consumers and/or the distribution channel in
the form of advertisements, coupons, and press
releases, for example. The Internet provides a
unique opportunity for marketers to utilize pull
marketing techniques whereby the consumer requests or seeks out information about products or
services via blogs, bulletin boards, and sponsored
chat rooms. Pull marketing can be an effective
marketing strategy because the consumer is
actively involved in seeking information about
desired products or services.
It has been reported that Generation Y, consumers born between 1979 and 1994, exhibit a
deep mistrust of advertising and prefer to rely on
product information provided by their friends for
their purchases (Belch & Belch, 2004). Previous
studies have also found that young people are
significantly more likely to shop online (Joines,
Scherer, & Scheufele, 2003). These preferences
have spurred the growth of marketing communication techniques referred to as stealth, buzz,
or word-of-mouth marketing. To create buzz
about their products, companies, such as Procter & Gamble, have been quietly recruiting paid
product fans to indirectly market their products
through word-of-mouth marketing techniques
such as personal endorsements and providing favorable product reviews or information on bulletin
boards and in chat rooms (Berner, 2006).
Much has been made recently of the use of
blogs for corporate marketing and public relations
purposes. Major corporations such as Boeing,
Walt Disney, and McDonald's are utilizing blogs
on their Web sites to share information with the
public and to monitor public opinion (Holmes,
2006). Some companies are finding that the role
that journalists used to play in the publicity of

new products and other announcements is now


being offered to bloggers to build support for their
products. In 2005, General Electric executives met
with environmental bloggers prior to making and
announcing investments in energy-efficient technology. Microsoft contacted bloggers to promote
their Xbox game system. Wal-Mart has utilized
blogs since 2005 to promote positive news about
itself and to repair its image (Barbaro, 2006).
Companies are finding that bloggers are often more
accessible and interested in doing more than just
reporting the news. Oftentimes, bloggers are just
as interested in shaping public opinion and they
can be eager to provide a critical assessment of
the company's products or marketing activities.
Privacy issues related to communication forums such as blogs, chat rooms, and bulletin boards
are many and are as likely to affect the marketer
as the consumer. For example, what happens to
the product or company's reputation if or when
it is discovered that the person(s) providing the
favorable information has been paid or provided
with cash or free products? Is it possible that a
company might encounter a backlash if their
marketing technique becomes widely known to
the general public?
The privacy that many feel when they adopt
a username and chat anonymously can, they believe, reduce inhibitions. But what happens
when the identity of the chatter is revealed? The
recent case of John Mackey, CEO of Whole
Foods, illustrates the risky nature of blogs and
online chat. For 8 years, Mackey posted comments on Yahoo Finance stock forums under
a pseudonym. His comments included positive
statements about his company and disparaging
comments about a competitor, Wild Oats, which
would become an acquisition target for Whole
Foods in 2007. Mackey's comments came to light
only after the FTC made the comments public in
its antitrust lawsuit to block the sale (Kesmodel
& Wilke, 2007).
The data ownership and dissemination issues
discussed earlier can be troubling for the millions
of consumers who join online communities. In
1999, GeoCities, a virtual community Web site,
sold personal information provided by its members to third parties. The FTC's director of the
Bureau of Consumer Protection called the case a
message to all Internet marketers that statements
about their information collection practices must
be accurate and complete. He added that monitoring and enforcement would be forthcoming
(Beltramini, 2003).

Online Marketing to Children


Data Monitor estimated that over 65 million
children in the U.S. and Europe had access to the
Internet at home (Bennett, 2006). Marketing to
children, who comprise a significant proportion
of online users, represents both an opportunity
and a dilemma for marketers. On one hand, children and teenagers comprise an attractive target
market for many products including clothing,
entertainment, and food (Day, 1999). One study
reported that U.S. companies spent an estimated
$15 billion just to advertise and market to children
age 12 and under (Munoz, 2003).
But the opportunities for marketing to children
can be overshadowed by government policies
put in place to protect children, who represent
a vulnerable population requiring oversight and
protection (Shimp, 2007). In addition to ongoing
government regulation, corporate policies have
been enacted in response to negative publicity and
to avoid future regulation. In 2007, for instance,
Kellogg's Corporation announced that it would
stop advertising products that did not meet specific nutritional guidelines to children under age
12, because of concerns about the impact of food
products on childhood obesity (Martin, 2007).
It has been estimated that about 75% of 14- to
17-year-olds and 65% of 10- to 13-year-olds used
the Internet in 2001 (Cooper & Victory, 2002)
and would spend $1.3 billion online in 2002
(Day, 1999). The ubiquity of home and school
computers and the emphasis on computer usage
in education make children and teenagers skilled
users of computers and prime targets for online
marketing of goods and services.
Two academic studies of children and teenagers illustrate that privacy issues for children are
often age-dependent and that older children (teenagers) are more aware of the risks and more likely
to manage online risks than younger children. A content
analysis by Cai and Gantz (2000) found that a
majority of Web sites analyzed (n=163) collected
information from children without disclosure or
parental involvement. In contrast, Youn's (2005)
study of 326 high school students' online behaviors
found that teenagers who exhibited a higher level
of perceived risk for information disclosure were
less willing to provide information. They also
utilized risk-reducing strategies such as providing
false or incomplete information.
These studies point to the need to continue to
monitor behaviors and provide legislative protections for our youngest consumers, especially
compliance that includes parental consent and
involvement. The Children's Online Privacy Protection Act (COPPA) of 1998 required Web sites
that target children to comply with a number of
privacy protections, including requiring online
users under the age of 13 to submit verifiable
parental consent before collecting, using, or
disclosing personal information (Federal Trade
Commission, 1999). Others have called for private intervention, including more education and
information in the form of privacy awareness
programs (Lewandowski, 2002).

Privacy Issues and Social Networks
Marketing via social networks is a fast-growing
technique available to marketers hoping to reach
a particular demographic group. Some social
networks, such as LinkedIn, are targeted toward
working professionals. Others, such as Facebook
and MySpace, have been designed for young
people and students. Marketers who want to reach
the highly sought-after 18-24 demographic have
become the medium's biggest fans. Mainstream
marketers such as Honda and Adidas are purchasing banner ads, sponsoring content and marketing
directly to the tens of millions of registered social
network members.
Marketing on social networks provides many
advantages to the marketer. The social network
contains a large target market that has self-identified by age, educational status, sex, and so forth.
Internet promotions can be less expensive than those using mainstream media. Media-restricted or regulated products such as tobacco
and alcohol can find a welcome audience.
Nevertheless, there have been many well-publicized examples of privacy issues associated with
popular social networks: minors meeting sexual
predators online, the public dissemination of the
names of sex offenders registered on MySpace,
and countless stories about the publication of tasteless photographs and content by social network
members who thought that their information was
private, despite its posting on a global medium.
In spite of the negative publicity about social
networks, marketing on social networking sites
makes good sense for those marketers seeking to
make contact with large groups of consumers.
Even though operators of social networking
sites are immune from liability for content posted
by their users, there is an element of potential risk
for marketers. Public support for safety measures
such as age verification procedures to protect the
under-age population is growing (Christ, Berges,
& Trevino, 2007). Because of the rapidly changing
environments for technology-driven marketing
opportunities such as social networks, it is recommended that mainstream marketers frequently
evaluate whether this is an arena they wish to
operate in and whether privacy issues will help
or hinder them in their efforts to market to the
young. Just as sites targeted at children have incorporated information for parents into their content,

it is recommended that social networks aimed at


minors take the lead in creating and promoting
a safe online environment and communicating
that information to parents, educators, and other
authority figures.

Conclusions and Recommendations for Further Research
This chapter has provided an overview of some
of the key issues related to privacy in marketing
contexts and recommendations for the future.
While it is not an exhaustive review of all marketing-related issues, it is a comprehensive discussion
of several timely and important online marketing
privacy issues affecting marketers and consumers.
But there will be many more marketing issues
to analyze and critique, particularly when one
considers the dynamic nature of both marketing
and the Internet medium.
The marketing issues presented in this chapter
provide fertile research opportunities because
many of the marketing technologies and techniques discussed here are relatively new and
changing rapidly. Privacy policy disclosure and
clarity is one area where we can expect to see
activity in the future. Much of the literature about
marketing and online privacy recommends the
inclusion of clearly written, prominently displayed
privacy policies. The thinking by many of those
who study this issue is that information is powerful and that awareness is the first step towards
compliance. Much needs to be done, because
only 5% of Fortune 500 companies were found to
fully comply with the guidelines for provision of
privacy policy notices on their Web sites (Hong,
McLaughlin, Pryor, Beaudoin, & Grabowicz,
2005). It may be that the provision of privacy
policies could become a Web-based requirement
for e-commerce, similar to the financial disclosures consumers must receive prior to investing
in a mutual fund. Turow (2003) recommends that
the federal government require P3P protocols on
all Web sites. Future research could be focused
on monitoring government and private sector
proposals and actions to discover whether self-regulation or government regulation of marketing
privacy activities ultimately leads to more and
better information for consumers.
The growth of Web 2.0 applications also represents an intriguing area of research into online
privacy. Interesting research could be conducted
about the growing use of public blogs for marketing purposes, for example. Boeing's eventual
positive experiences with blogs did not begin
well because reader comments were not allowed
initially (Holmes, 2006). Content analysis could be
performed that analyzes blog structure and design
and the quantity or variety of public commentary
among blogging companies, nonprofits, or other
institutions. Analysis of monitoring policies for
the use of blogs and chat rooms by corporations
and other public entities is another recommended
area of research that would be helpful to managers
planning communication strategies.
Privacy issues in e-mail marketing are also
worthy of research, especially because privacy
issues in electronic mail are an unsettled aspect
of online interaction (Strauss et al., 2006). An
international survey of consumers revealed that
nearly 21% of respondents like to receive direct
postal mail, but only 6% of them like to receive
commercial e-mail (Hoffman et al., 1999). Much
of the dislike can probably be attributed to the
massive and continued use of commercial SPAM
and concerns about viruses in attachments, topics ripe for further study. An interesting e-mail
privacy issue would entail understanding users'
perceptions of privacy related to Google's Gmail,
which trades server space for monitoring of users'
words to match them with product ads (Dobosz et
al., 2006). The current state of e-mail marketing
practices in light of the CAN-SPAM Act of 2003
would also be beneficial to study.



Mobile computing offers new opportunities
to marketers. Mainstream marketers such as
Coca-Cola and McDonald's continue to utilize
commercial text messaging and other mobile
promotions (Sultan & Rohm, 2005). The ability
to send effective promotional messages to cell
phones is currently limited by factors such as the
size of the screen and the risk of consumer irritation. Mobile advertising also requires different
techniques for monitoring advertising effectiveness (Strauss et al., 2006).
Future research will undoubtedly consider not
only the feasibility of this high potential medium
but also the associated privacy issues, including the
identity, ownership, and use of consumers' phone
records. Countries such as India are betting heavily
on a mobile (cell phone) computing technology
platform in the future. Their experiences with
mobile computing will provide fertile territory
for mobile privacy research as the world moves
to the adoption of a mobile personal computing
platform that is likely to include other advertising
and privacy issues.
Marketing to children and young people
will continue to be an area fraught with privacy
concerns both online and off. Some of the same
media strategies that have been used for alcohol
and tobacco awareness could be utilized with
these groups. An interesting experiment could
be conducted that tested children's awareness of
privacy issues and safe behaviors before and after
an advertising treatment served to students
in a school lab or library setting.
Privacy and security issues will continue to
dominate the online marketing landscape and
there will be numerous opportunities to learn
from others' online marketing successes and
mistakes. Efforts by industry, the government,
and watchdog groups to monitor practices and
propose improvements for protecting the privacy
of all users will be ongoing.

References
Ashworth, L., & Free, C. (2006). Marketing
dataveillance and digital privacy: Using theories of justice to understand consumers online
privacy concerns. Journal of Business Ethics,
67, 107-123.
Auerbach, J. G. (1999, March 31). To get IBM ad,
sites must post privacy policies. The Wall Street
Journal, pp. B1, B4.

Christ, R. E., Berges, J. S., & Trevino, S. C. (2007).


Social networking sites: To monitor or not to monitor users and their content? Intellectual Property
& Technology Law Journal, 19(7), 13-17.
Cooper, K. B., & Victory, N. J. (2002, February).
A nation online: How Americans are expanding
their use of the internet. Washington, DC: U.S.
Department of Commerce.

Bagozzi, R. (1975). Marketing as exchange. Journal of Marketing, 39(4), 32-39.

Culnan, M. J. (1995). Consumer awareness of


name removal procedures: Implications for direct marketing. Journal of Direct Marketing, 7,
10-19.

Barbaro, M. (2006, March 7). Wal-Mart enlists bloggers in P.R. campaign. The New York
Times.

Culnan, M. J. (2000). Protecting privacy online:


Is self-regulation working? Journal of Public
Policy & Marketing, 19(1), 20-26.

Belch, G. E., & Belch, M. A. (2004). Advertising and promotion: An integrated marketing
communications perspective (6th ed.). New York:
McGraw-Hill/Irwin.

Daub, T. R. (2001). Surfing the net safely and


smoothly: A new standard for protecting personal
information from harmful and discriminatory
waves. Washington University Law Quarterly,
79, 913-949.

Beltramini, R. F. (2003). Application of the unfairness doctrine to marketing communications


on the Internet. Journal of Business Ethics, 42(4),
393-400.
Bennett, C. (2006). Keeping up with the kids.
Young Consumers, Quarter 2, 28-32.
Berner, R. (2006, May 29). I sold it through the
grapevine. Business Week, pp. 32-33.
Cai, X., & Gantz, W. (2000). Online privacy
issues associated with web sites for children.
Journal of Broadcasting & Electronic Media,
44(2), 197-214.
Caudill, E. M., & Murphy, P. E. (2000). Consumer
online privacy: Legal and ethical issues. Journal
of Public Policy & Marketing, 19(1), 7-19.
Chen, K., & Rea, A. (2004). Protecting personal
information online: A survey of user privacy concerns and control techniques. Journal of Computer
Information Systems, 44(4), 85-92.

Day, J. (1999, February 22). Report: Teen online


spending increases. Retrieved July 12, 2007, from
http://www.ecommercetimes.com/perl/story/366.
html
Dobosz, B., Green, K., & Sisler, G. (2006). Behavioral marketing: Security and privacy issues.
Journal of Information Privacy & Security, 2(4),
45-59.
Federal Trade Commission. (1998). Privacy
online: A report to congress. Retrieved July 14,
2007, from http://www.ftc.gov/reports/privacy3/
index.shtm
Federal Trade Commission. (1999). Children's
Online Privacy Protection Rule, Federal Register,
64, 212. (November 3), 59888-59915.
Hoffman, D., Novak, T. P., & Peralta, M. (1999).
Building consumer trust online. Association for
Computing Machinery. Communications of the
ACM, 42(4), 80-85.




Holmes, S. (2006, May 22). Into the wild blog


yonder. Business Week, pp. 84-86.
Hong, T., McLaughlin, M. L., Pryor, L., Beaudoin,
C., & Grabowicz, P. (2005). Internet privacy practices of news media and implications for online
journalism. Journalism Studies, 6(2), 15-28.
Joines, J. L., Scherer, C. W., & Scheufele, D.
A. (2003). Exploring motivations for consumer
web use and their implications for e-commerce.
The Journal of Consumer Marketing, 20(2/3),
90-108.
Jones, M. (1991). Privacy: A significant marketing
issue for the 1990s. Journal of Public Policy &
Marketing, 10(spring), 133-148.
Kesmodel, D., & Wilke, J. R. (2007). Whole Foods
is hot, Wild Oats a dud -- so said Rahodeb. The
Wall Street Journal, 250(9), A1.
Klang, M. (2004). Spyware-the ethics of covert
software. Ethics and Information Technology,
6, 193-202.
Krebs, B. (2001). Etour.com data sales violate
policy. Newsbytes News Network. Retrieved July
13, 2007, from
http://findarticles.com/p/articles/mi_m0NEW/
is_2001_May_29/ai_75139832
Lewandowski, J. (2002). Stepping off the sidewalk:
An examination of the data collection techniques
of web sites visited by children. Doctoral dissertation, Purdue University, W. Lafayette, IN.

Montana, J. C. (2001). Data mining: A slippery


slope. Information Management Journal, 35(4),
50-52.
Munoz, S. (2003, November 11). Nagging issue:
Pitching junk to kids. The Wall Street Journal
Online. Retrieved July 14, 2007, from http://online.wsj.com
Oberndorf, S. (1998). Users remain wary. Multichannel Merchant. Retrieved July 14, 2007, from
http://multichannelmerchant.com/news/marketing_users_remain_wary/
Radin, T. J., Calkins, M., & Predmore, C. (2007).
New challenges to old problems: Building trust
in e-marketing. Business and Society Review,
112(1), 73-98.
Savvas, A. (2005, March 22). Monitoring made
harder by cookie security fears, Computer Weekly.
Retrieved July 14, 2007, from
http://www.computerweekly.com/Articles/Article.aspx?liArticleID=208948&PrinterFriendl
y=true
Schwaig, K. S., Kane, G. C., & Storey, V. C.
(2005). Privacy, fair information practices, and
the Fortune 500: the virtual reality of compliance.
The DATA BASE for Advances in Information
Systems, 36(1).
Sheehan, K. B. & Hoy, M. G. (1999). Flaming,
complaining, abstaining: How online users respond to privacy concerns. Journal of Advertising,
28(3), 37-51.

Martin, A. (2007, June 14). Kellogg to phase out


some food ads to children. The New York Times.
Retrieved July 11, 2007, from www.nytimes.
com/2007/06/14/business/14kellogg.html?ex=11
83953600&en=8f34f9be3c1620f7&ei=5070

Shimp, T. (2007). Advertising, promotion, and


other aspects of integrated marketing communications (7th ed.). Mason, OH: Thomson SouthWestern.

Media Week marketer's guide to media (Vol. 29).


(2006). New York: VNU Business Publications.

Strauss, J., El-Ansary, A., & Frost, R. (2006).


E-marketing (4th ed.). Upper Saddle River, NJ:
Pearson Prentice Hall.

Milne, G.R. (1997). Consumer participation in


mailing lists: A field experiment. Journal of Public
Policy & Marketing, 16, 298-309.


Sultan, F., & Rohm, A. (2005). The coming era


of "brand in the hand" marketing. MIT Sloan
Management Review, 47(1), 82-90.


Turow, J. (2003). Americans & online privacy:


The system is broken. Report from the Annenberg Public Policy Center of the University of
Pennsylvania, June.
Wang, H., Lee, M. K. O., & Wang, C. (1998).
Consumer privacy concerns about internet marketing. Association for Computing Machinery.
Communications of the ACM, 41(3), 63-68.
Xie, E., Teo, H-H, & Wan, W. (2006). Volunteering personal information on the internet: Effects
of reputation, privacy notices, and rewards on
online consumer behavior. Marketing Letters,
17, 61-74.
Youn, S. (2005). Teenagers' perceptions of online
privacy and coping behaviors: A risk-benefit
appraisal approach. Journal of Broadcasting &
Electronic Media, 49(1), 86-110.

Additional Readings
Cantos, L., Fine, L., Porcell, N., & Selby, S. E.
(2001). FTC approves first COPPA safe harbor
application. Intellectual Property & Technology
Law Journal, 13(4), 24.
Caudill, E. M., & Murphy, P. E. (2000). Consumer
online privacy: Legal and ethical issues. Journal
of Public Policy & Marketing, 19(1), 7-19.
Dobrow, L. (2006). Privacy issues loom for marketers. Advertising Age, 77(11), S6.
Dommeyer, C., & Gross, B. L. (2003). What consumers know and what they do: An investigation
of consumer knowledge, awareness, and use of
privacy protection strategies. Journal of Interactive Marketing, 17(2), 34-51.
Eastlick, M. A., Lotz, S. L., & Warrington, P.
(2006). Understanding online B-to-C relationships: An integrated model of privacy concerns,
trust, and commitment. Journal of Business
Research, 59, 877-886.

George, J. F. (2004). The theory of planned behavior and Internet purchasing. Internet Research,
14(3), 198-212.
Han, P., & Maclaurin, A. (2002). Do consumers really care about online privacy? Marketing
Management, 11(1), 35-38.
Hann, I., Hui, K., Lee, T., & Png, I. (2002). Online
information privacy: Measuring the cost-benefit
trade-off. In Proceedings of the Twenty-third
International Conference on Information Systems.
Heckman, J. (1999). E-marketers have three child-privacy options. Marketing News, 33(16), 5-6.
Kiang, M. Y., Raghu, T. S., & Shang, K. H-M
(2000). Marketing on the Internet -- who can benefit from an online marketing approach? Decision
Support Systems, 27(4), 383-393.
Langenderfer, J., & Cook, D. L. (2004). Oh, what
a tangled web we weave: The state of privacy
protection in the information economy and recommendations for governance. Journal of Business
Research, 57, 734-747.
Luo, X. (2002). Trust production and privacy
concerns on the Internet: A framework based
on relationship marketing and social exchange
theory. Industrial Marketing Management, 31,
111-118.
Metzger, M. J., & Docter, S. (2003). Public opinion
and policy initiatives for online privacy protection.
Journal of Broadcasting & Electronic Media,
47(3), 350-374.
Milne, G. R., & Culnan, M. J. (2004). Strategies
for reducing online privacy risks: Why consumers
read (or don't read) online privacy notices. Journal
of Interactive Marketing, 18(3), 15-29.
Miyazaki, A. D. & Fernandez, A. (2001). Consumer perceptions of privacy and security risks
for online shopping. Journal of Consumer Affairs,
35(1), 27-44.




Morrison, K. L. (2003). Children reading commercial messages on the Internet: Web sites that
merge education, information, entertainment, and
advertising. Doctoral dissertation, University of
California, Los Angeles.
Palmer, D. E. (2005). Pop-ups, cookies, and spam:
Toward a deeper analysis of the ethical significance of Internet marketing practices. Journal of
Business Ethics, 58, 271-280.
Pan, Y., & Zinkhan, G. M. (2006). Exploring the
impact of online privacy disclosures on consumer
trust. Journal of Retailing, 82(4), 331-338.
Roman, S. (2007). The ethics of online retailing: A scale development and validation from
the consumer's perspective. Journal of Business
Ethics, 72, 131-148.
Schwartz, G. (2003). Mobile marketing 101.
Marketing Magazine, 108(26), 21.
Sheehan, K. B. (2002). Toward a typology of
Internet users and online privacy concerns. The
Information Society, 18(1), 21-32.


Sheehan, K. B., & Hoy, M. G. (2000). Dimensions


of privacy concern among online consumers.
Journal of Public Policy & Marketing, 19(1),
62-73.
Stewart, D., & Zhao, Q. (2000). Internet marketing: Business models and public policy. Journal
of Public Policy & Marketing, 19(2), 287-296.
Vascellaro, J. (2007, April 30). Virtual worlds now
cater to kids, but are they safe? The Wall Street
Journal, p. B.1.
Wijnholds, H., & Little, M. W. (2001). Regulatory
issues for global e-tailers: Marketing implications. Academy of Marketing Science Review,
2001, 1-12.
Yang, S., Hung, W., Sung, K., & Farn, C. (2006).
Investigating initial trust toward e-tailers from
the elaboration likelihood model perspective.
Psychology & Marketing, 23(5), 429.

Chapter XIII
An Analysis of Online Privacy Policies of Fortune 100 Companies
Suhong Li
Bryant University, USA
Chen Zhang
Bryant University, USA

Abstract
The purpose of this chapter is to investigate the current status of online privacy policies of Fortune
100 Companies. It was found that 94% of the surveyed companies have posted an online privacy policy
and 82% of them collect personal information from consumers. The majority of the companies only
partially follow the four principles (notice, choice, access, and security) of fair information practices.
For example, most of the organizations give consumers some notice and choice in terms of the collection and use of their personal information. However, organizations fall short in security requirements.
Only 19% of organizations mention that they have taken steps to provide security for information both
during transmission and after their sites have received the information. The results also reveal that a
few organizations have obtained third-party privacy seals including TRUSTe, BBBOnline Privacy, and
Safe Harbor.



Introduction
Privacy is defined as the right to be let alone
which is part of the basic human rights to enjoy
life (Warren, 1890). As an extension of privacy
in the information age, information privacy is
the legitimate collection, use, and disclosure of
personal information, or the claims of individuals
that data about themselves should generally not be
available to other individuals and organizations,
and that, where data is possessed by another party,
the individual must be able to exercise a substantial degree of control over that data and its use
(Clarke, 1999). One type of information privacy
is online privacy, which is defined as consumer
concerns about what data is being collected by
an online vendor about the customer and how it
will be used (Nyshadham, 2000). Compared
to an off-line environment, the Internet enables
organizations to collect more information from
consumers cost effectively, sometimes even without the consent of consumers. The Internet poses
greater security threats for consumers as their personal information is transmitted over the Internet
if an organization does not have a good security
mechanism in place. Furthermore, the connectivity of the Internet allows organizations to capture
and build electronic profiles of consumers and
potential consumers. Therefore, consumers today
are facing a high level of privacy threat/invasion.
One way to show an organization's commitment
to protect consumers' online privacy is to post
an online privacy policy and follow the policy
truthfully. Online privacy has been viewed as a
significant factor contributing to consumer trust
and therefore an imperative for business success
(Privacy & American Business, 2002). However,
its provision is often at odds with organizational
goals, such as the maximization of personal
information value obtained from disclosure to
third parties (often for commercial gain) and
the retention of customer loyalty via enhanced
personalized services (Lichtenstein, Swatman,
& Babu, 2003).


The confrontation of individual versus organizational privacy perspectives has started to
draw social and governmental attention. The
Federal Trade Commission (FTC) has brought a
number of principles to enforce the promises in
organizations' privacy statements (FTC, 1998;
FTC, 2005). The FTC suggests a set of principles
regarding collection, use, and dissemination of
information which will ensure fair information
practices. These principles include four core
principles called notice, choice, access, and security. The implementations of these principles
are as follows: first, organizations should tell
consumers what information they collect and
how it will be used (notice); second, consumers
should be offered a choice about having their
personal information used for other unrelated
purposes or shared with third parties (choice);
third, consumers should be able to review their
personal information and have errors corrected
(access); finally, organizations should protect the
personal information they collect (security). If an
organization follows all these principles, it can
then be said to follow fair information practices
(Nyshadham, 2000). Fair information practices
have been used as a standard to evaluate the online
privacy policy of organizations in several studies
(Nyshadham, 2000).
Although online privacy issues have drawn
social and governmental attention, the legislation
of online privacy protection has not been fully
implemented within the increasingly globalized
e-commerce world. The European Union Directive on Privacy and Electronic Communications
(EU Directive 2002/58/EC) has been adopted by
the EU. However, implementation of the EU directive
by the member states has been slow because of
resistance such as the considerable increase of
interest in the use (and retention) of traffic data
by law enforcement authorities (EDRI, 2004).
Although the U.S. Federal Trade Commission
(FTC, 1998) has published a guideline to enforce
the promises in organizations' privacy statements
(FTC, 2005) and many bills related to consumer

privacy are currently reviewed by Congress
(CDT, 2005), the U.S. Online Privacy Protection Act is still in the proposal stage (Baumer, Earp,
& Poindexter, 2004). As a result, the current
online privacy protection legislation in effect varies by industry and state. Among these,
the Children's Online Privacy Protection Act
(COPPA) and Health Insurance Portability and
Accountability Act (HIPAA) have already taken
effect (Desai, Richards, & Desai, 2003). The
privacy provisions of the Gramm-Leach-Bliley
Act (GLB Act) require disclosure of a financial
institution's privacy policy to consumers and
requires that the institution provide the consumer
an opportunity to opt out of any disclosures of
non-public personal information to non-affiliated
third parties (Wolf, 2004). The California Online
Privacy Protection Act, which took effect on July 1,
2004, requires Web sites to post the online
privacy policy (OPP) conspicuously with effective
dates, in addition to other requirements similar to
the FTC guideline and applies to any operator of a
Web site or online service that collects personally
identifiable information from consumers residing
in California (Englund, 2004).
Meanwhile, private organizations have been
working on regulating online privacy protection
for many years. Several third party privacy seals
along with consulting service are available currently. These include TRUSTe, BBB (Better Business Bureau) Online, CPA Web Trust, and ESRB
(Entertainment Software Rating Board) Privacy
Online. Both TRUSTe and BBBOnline were
founded in 1997. The European Commission's
directive on data protection went into effect in
October of 1998 and prohibits the transfer of
personal data to non-European Union nations
that do not meet the European adequacy standard for privacy protection (BBBOnline, 2005).
Therefore, non-European Union companies that
need to transfer customer information trans-Atlantic must comply with the European Union's
safe harbor framework approved by the EU
in July of 2000. These four types of privacy

certificate programs reflect the self-regulation
of the industry.
Surveys have shown that consumers are
concerned about the collection and use of their
personal information when using e-commerce
Web sites and will stay away from the electronic
marketplace until they are confident their personal
information is protected and used properly (FTC,
1998). As a result, more and more companies
have posted an online privacy policy in order
to provide customers with a sense of security
regarding the protection of their personal data.
In addition, organizations have begun to obtain
some third-party privacy seals of trust. Several
studies have focused on the investigation of the
online privacy policy of organizations (Culnan,
1999; Nyshadham, 2000; Desai et al., 2003; Lichtenstein et al. 2002; McRobb & Rogerson, 2004).
Those studies indicated that significant differences
exist among the length, language, and content of
privacy policies of organizations.
The purpose of this study is to perform an
up-to-date survey of online privacy policies of
Fortune 100 companies with the goals of increasing our understanding of current online privacy
practices, identifying the common deficiencies,
and providing suggestions for future improvement
of online privacy policies.

bAckground
An online privacy policy (OPP) is a key organizational measure for assuring online privacy for Web
site users (Lichtenstein et al., 2003). Serving as
the high-level guideline for an organization's
information privacy, the promises made in OPPs
vary from one another, reflecting differences
in organizational recognition of online
privacy issues and in privacy practices.
Several studies have focused on the investigation of OPPs using survey methodology (Culnan,
1999; Nyshadham, 2000; Desai et al., 2003; Lichtenstein et al., 2002; McRobb & Rogerson, 2004).



The purposes of these studies are to understand
the current practices, compare the differences,
identify the deficiencies, and offer suggestions
for future improvement of online privacy policies. These studies suggest the research design
and methodologies for conducting privacy policy
survey and thus provide a good starting point for
our research. We will discuss those studies briefly
in the following paragraphs.
Culnan's report to the FTC (Culnan, 1999)
studied 300 dot-com Web sites to investigate
the self-regulation status of these companies in
compliance with the FTC guidelines. The report
found that although the majority of these dot-com Web sites (79.7%) posted privacy policies,
their compliance with the FTC's guidelines was
poor: only 22% provide consumers with notice
regarding collection of personal information. On
the important issue of providing individuals with
the capacity to control the use and disclosure of
personal information, the survey found that 39.5%
say that consumers can make some decision about
whether they are re-contacted for marketing
purposes and fewer still, 25%, say they provide
consumers with some control over the disclosure
of data to third parties.
Nyshadham (2000) studied the online privacy
policies of 23 airlines, focusing on the fair information practices suggested by the FTC (FTC, 1998).
His research indicates that significant differences
exist among majors, nationals and intermediaries
in the privacy practices. The air travel industry
seemed to fall behind other industries in implementing fair information practices. According to
his research, all the surveyed firms collect customer information and 90% of them collect both
personal identifying information and demographic
information. The result showed 100% of online
air travel intermediaries (agents) posted privacy
policies. In contrast, 30% of major airlines
and 92% of national airlines collected customer
information without providing a privacy policy
in 2000. The results also showed that no privacy
policy statement followed all four principles and
again the ordering from best to worst practice is
online intermediaries, major airlines, and then
national airlines. In addition, a few firms (one
intermediary, one major, and no national) used
some type of privacy seals of trust.
Lichtenstein et al. (2002) analyzed the OPPs
of 10 companies in various industries in America
and Australia. The authors focused on identifying the deficiencies in OPPs. Instead of using
statistical methods, the authors performed a
qualitative analysis and in-depth case studies.
Trends, patterns, and differences are captured in
the categories of: awareness, data quality, security, information movement, user identification,
accountability, user access, assurance, contact,
choice, change management, children's privacy,
sensitive information, and exceptions. The authors
then provided detailed guidelines for improvements
in each category.
Desai et al. (2003) researched the OPPs of over
40 e-commerce companies. The OPPs were evaluated based on the five privacy policy categories:
privacy, returns, shipping, warranty, and security
using a scale of 0 to 5. Their study also performed
longitudinal comparisons from 1999 to 2001
and showed that e-commerce companies have
improved communications of these policies with
their customers significantly over the years.
McRobb and Rogerson (2004) performed an
empirical survey of 113 OPPs belonging to different industries and countries. They suggested
that a five-point scale is too simple to produce
useful insights for the complex content of many
privacy policies. The authors used statistical and
subjective methods to evaluate the OPPs in terms
of personal information collection, disclosure to
third parties, customer consent, contact information, check/amend procedure, data protection,
cookies, children's issues, trust seal, and further
privacy advice. The study performed a regional
comparison of EU and North America and concluded that EU policies are more prominent but
are less informative. The study also conducted
industrial sector comparisons and showed that

retail and Internet services performed better
than the average, while travel/tourism and public
utilities performed worse than the average. It can
be seen that previous studies on online privacy
policies all use the four principles suggested by
the FTC to evaluate the quality of online privacy policies. However, some studies focus on a
specific industry such as airlines (Nyshadham,
2000), some just focus on e-commerce companies
(Culnan, 1999; Desai et al., 2003); some include
less than 50 samples in their studies (Nyshadham,
2000; Desai et al., 2003; Lichtenstein et al., 2002).
No study could be found which has investigated
the online privacy policies of Fortune 100 companies. The purpose of this chapter is to fill this
gap by providing the current status of online
privacy policies of the Fortune 100 companies
which represent the largest companies in various
industries in the U.S. and thus will enhance our
understanding of privacy policy practices for the
largest companies in the U.S.

Main Thrust of the Chapter

Research Methodology and Data Collection
The Fortune 100 companies are chosen as the
target sample in our study (see Appendix). Four
questions will be investigated: (1) How many Web
sites surveyed have posted a privacy policy?; (2)
What personal information is collected by the
Web site from consumers?; (3) Do those privacy
disclosures follow fair information practices?;
and (4) How many Web sites use third-party seals
(such as TRUSTe and BBBOnline) to provide privacy
assurances? The set of questions for measuring
fair information practices was adopted from
Nyshadham (2000) (see Table 2), with additional
questions regarding privacy seals and more recent
data (collected in February, 2005). This instrument
has also been used in several privacy studies, including the privacy study conducted by the FTC

(FTC, 1998), the Culnan report to the FTC (Culnan, 1999), and Online Privacy Alliance (OPA,
2005). The data collection process was as follows:
First, the Web site of each Fortune 100 company
was visited by the authors. Second, the online
privacy policy of each Web site, if available, was
reviewed and evaluated carefully by the authors
based on the four groups of questions described.
To guarantee the validity and consistency of data
collected, the authors first developed and agreed
on the way of interpreting the information and
one author would double check with the other if
any discrepancies arose during data collection.
The same method was used in most studies of
online privacy policies.
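As a purely illustrative sketch of the tabulation behind the tables that follow (the two sample sites and their coded answers below are made up, not taken from the actual data set), each yes/no judgment per site can be tallied into the reported percentages with a few lines of Python:

    # Made-up example data: one coded yes/no answer set per surveyed Web site.
    coded_answers = {
        "example-site-a.com": {"notice_what_collected": True, "security_during_transmission": False},
        "example-site-b.com": {"notice_what_collected": True, "security_during_transmission": True},
    }

    def percentage(question: str) -> float:
        """Share of surveyed sites whose policy answers 'yes' to one question."""
        yes = sum(1 for answers in coded_answers.values() if answers.get(question))
        return 100.0 * yes / len(coded_answers)

    print(percentage("notice_what_collected"))         # 100.0 for the made-up data
    print(percentage("security_during_transmission"))  #  50.0 for the made-up data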

Data Analysis
The Number of Web Sites Having
Privacy Policy
It was found that out of 100 Web sites, six do not
post a privacy policy, since these companies usually do not have direct contact with consumers.
For the 94 companies having an online privacy
policy, five of them indicate that their Web sites are
only for the purpose of displaying information. In
sum, 89 companies have an online privacy policy and
also collect consumer information online. Those
companies will be used in the later analysis.

Type of Information Collected


Table 1 shows that 82% of companies indicate that
they will collect personal information from the
consumers. Out of this group, some organizations
also indicate the type of information they collect.
For example, almost half of the organizations will
collect name, e-mail address, and postal address
from the consumers. In addition, the types of
information the companies collect also include
phone number, credit card number, age/date of
birth, family information, gender, education,
income, preferences, and occupation.



It was also found that 79% of the companies
use cookies to collect non-personal information.
A cookie is a small data file that a Web site or
e-mail may send to a user's browser, which may
then be stored on the user's hard drive. The cookies
allow a Web site to recognize a user when he/she
returns to the same site to provide the user with
a customized experience. Most of the companies
indicate that they do not store personal information in the cookies.
In addition, 28% of the companies use Web
beacons, also known as clear gifs, or Web bugs,
which allow a company to collect certain information automatically. The collected information
may include a user's Internet protocol (IP) address,
computer's operating system, browser type, the
address of a referring Web site, and any search
terms a user may have entered on the site, among
other information. Companies may include Web
beacons in promotional e-mail messages or newsletters in order to determine whether messages
have been opened and acted upon.

Table 1. Type of information collected by Web sites

Type of information | Number | Percentage

Personal Information
Personal information | 73 | 82.0%
Name | 53 | 59.6%
E-mail address | 50 | 56.2%
Postal address | 41 | 46.1%
Phone number | 34 | 38.2%
Credit card number | 18 | 20.2%
Social security number | | 5.6%
Age/date of birth | 11 | 12.4%
Family information | | 3.4%
Gender | | 4.5%
Education | | 2.2%
Income | | 3.4%
Preferences/interests | | 9.0%
Occupation | | 7.9%

Non-Personal Information
Cookies/tracers | 70 | 78.7%
Clear Gifs/Web beacons/Web bugs | 25 | 28.1%
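To illustrate what such automatic collection looks like on the server side, the minimal Python sketch below is hypothetical: the handler name, port, and logged fields are assumptions for illustration, and real beacon services are far more elaborate (they typically return a 1x1 transparent GIF rather than an empty response). It simply logs the metadata that every beacon request reveals on its own:

    # Illustrative sketch of a Web-beacon endpoint: each request for the tiny
    # image is logged with the metadata the request itself carries.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class BeaconHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            print({
                "ip": self.client_address[0],                   # visitor's IP address
                "browser": self.headers.get("User-Agent"),      # browser and operating system
                "referring_page": self.headers.get("Referer"),  # page or e-mail that embedded the beacon
                "query": self.path,                             # e.g. /pixel.gif?newsletter=2005-02
            })
            self.send_response(204)   # no visible content; a real beacon serves a 1x1 GIF
            self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("localhost", 8080), BeaconHandler).serve_forever()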

Fair Information Practices


Table 2 shows the status of fair information
practices for the Fortune 100 companies. It can
be seen that in general, Fortune 100 companies
have not followed all four core principles of fair
information practices fully. In relation to the
notice requirements, about 70% of the sites have
indicated what information will be collected
from consumers, how it collects information
from consumers, and how the information it
collected from consumers will be used. Regarding the choice requirements, 69% of the sites
indicate that they will use information the site
has collected to contact consumers for marketing
or other purposes. At the same time, about half
(52%) indicate that they will give consumers the
choice of whether they want to be contacted by
this organization for marketing or other purposes.
About 60% of the companies say that information
they collect from consumers may be disclosed to
outside third parties (e.g., advertisers, business
partners, or affiliates), however, only 23% of the
Web sites indicate that they will give consumers
the choice of opting out of the disclosure to outside
third parties. In addition, a small percentage of
the companies (12%) indicate that they will only
disclose the information in aggregate form to the
third parties.
Regarding the access requirements, more than
half (56%) of the companies provide consumers
some way to review and modify their personal
information, in most cases, through access to an
online account. However, only one third of the
companies (33%) mention how inaccuracies in
personal information collected are handled.
Compared to notice, choice, and access requirements of fair information practices, security
requirements receive the least attention from the
companies. Only 19% of the companies indicate

An Analysis of Online Privacy Policies of Fortune 100 Companies

Table 2. Fair information practices for Fortune 100 companies

Notice
- Does the site say anything about what information it collects from consumers? 63 (70.8%)
- Does the site say anything about how it collects information from consumers? 62 (69.7%)
- Does the site say how the information it collected from consumers will be used? 56 (62.9%)

Choice
- Does the site say that this organization may use information the site has collected to contact consumers for marketing or other purposes? 61 (68.5%)
- Does the site say that it gives consumers choice about whether they want to be contacted by this organization for marketing or other purposes? 46 (51.7%)
- Does the site say that the information collected from consumers may be disclosed to outside third parties (e.g., advertisers, business partners, or affiliates)? 53 (59.6%)
- Does the site say it only discloses this information to outside third parties in aggregate form? 11 (12.4%)
- Does the site say it gives consumers choice about having collected information disclosed to outside third parties? 20 (22.5%)

Access
- Does the site say that it allows consumers to review or modify the information that the site has collected? 50 (56.2%)
- Does the site say how inaccuracies with the personal information the site has collected are handled? 30 (33.7%)

Security
- Does the site say anything about the steps it takes to provide security for information during transmission? 14 (15.7%)
- Does the site say anything about the steps it takes to provide security for information after the site has received the information (not during transmission, but after collection)? 18 (20.2%)
- Mentioned both: 17 (19.1%)

Security seal
- SSL: 26 (29.2%)
- VeriSign

Contact information: 25 (28.1%)
- Does the site say how to submit a question about privacy? (e.g., provide contact information) 44 (49.4%)
- Does the site say how to complain to the company or another organization about privacy? (e.g., provide contact information) 16 (18.0%)

Only 19% of the companies indicate that they have
taken steps to provide security for the information
both during transmission and after the site has
received it. Another 16% mention that they have
taken steps to provide security for the information
during transmission but do not mention how they
handle the security of the information after they
receive it, and another 20% indicate that they have
taken steps to provide security after receiving the
information without mentioning the security of the
information during transmission. Almost half of the
companies do not mention the security of the
information in their online privacy policy at all.
In terms of security standards and certificates,
29% indicate that they have used secure socket
layer (SSL) for the security of the information, and
5% indicate that they have used a VeriSign digital
certificate during the transmission of the information.

The results also show that about 50% of the
companies provide information to consumers on
how to contact them about privacy questions.
However, only 18% provide a way for consumers
to complain about privacy violations or concerns.
In sum, it can be seen that, in general, Fortune
100 companies have only partially followed the four
principles of fair information practices (notice,
choice, access, and security). They give consumers
notice about the information they collect, the way
they collect it, and how it is used. They also give
consumers the choice of receiving marketing information
from them and allow customers to review and
modify the information they collect. However,
the majority of them do not give consumers a
choice about the disclosure of their information to
outside third parties, and do not provide customers
a way to address inaccuracies in personal information.
Moreover, most of them fall short of the security
requirements.

Privacy Seals

Table 3 shows that a few firms have begun to use
third-party privacy seals (19%), including TRUSTe
(9%), BBBOnline Privacy (8%), and Safe Harbor
(2%). These third-party firms survey an applying
company's online privacy practices and certify them.
A certified company can then display the graphic seal
on its Web site. This is a form of self-regulation
that a company can abide by.
Table 3. Third-party privacy seals for the surveyed sites

- TRUSTe: 9.0%
- BBBOnLine Privacy (Better Business Bureau OnLine privacy): 7.9%
- Safe Harbor: 2.2%
- P3P: 4 (4.5%)

In addition, four companies support the platform
for privacy preferences (P3P) standard (W3C, 2002),
which can be used by Web browsers to automatically
shield users from sites that do not provide the level
of privacy protection they desire. Using the P3P
standard, a Web site can publish a summary of its
privacy practices in machine-readable form. This
summary is interpreted by browsers equipped to
read P3P summaries. Such browsers may then
display alerts or block transmission of information
if the Web site's privacy practices conflict
with the preferences of the user.
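To illustrate how a user agent might act on such a machine-readable summary, the sketch below checks the "compact policy" that sites may send in the P3P HTTP response header and compares its tokens against a user's preferences. This is only a rough, hypothetical example: the chosen token set, the URL, and the accept/reject rule are illustrative assumptions, not part of the survey described in this chapter.

    # Rough, hypothetical illustration of a P3P-aware check. Sites may advertise a
    # compact policy in the P3P response header (e.g. P3P: CP="NOI DSP COR ...");
    # the unacceptable-token set and the decision rule below are assumptions only.
    import re
    import urllib.request

    # Hypothetical user preference: treat these compact-policy tokens as unacceptable.
    UNACCEPTABLE_TOKENS = {"SAM", "OTR", "UNR"}

    def check_p3p(url: str) -> None:
        with urllib.request.urlopen(url) as resp:
            header = resp.headers.get("P3P", "")
        match = re.search(r'CP="([^"]*)"', header)
        if not match:
            print(url, "- no P3P compact policy advertised")
            return
        tokens = set(match.group(1).split())
        conflicts = tokens & UNACCEPTABLE_TOKENS
        if conflicts:
            print(url, "- policy conflicts with user preferences:", sorted(conflicts))
        else:
            print(url, "- compact policy accepted:", sorted(tokens))

    if __name__ == "__main__":
        check_p3p("https://www.example.com/")

A P3P-enabled browser performs essentially this kind of comparison automatically against the user's stored preferences before accepting cookies or submitting information.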

Additional Observations on Online Privacy Policies

Even though most companies post privacy policies, the length, language, and contents of the online privacy policies vary from one to another. On
one hand, organizations want to give consumers
a choice about how their personal information can
be used; on the other hand, organizations try to
maximize the use of personal information for
marketing and other purposes. This dilemma is
reflected in the privacy policy statements of
some companies.

For example, the online privacy policy of State
Farm says: "You may contact your State Farm agent
if you prefer that we not share this information
within State Farm. Your choice does not limit
State Farm from sharing certain information about
your transactions with us (such as your name, address, and payment history) or your experiences
with us (such as your State Farm claim activity).
This choice does not apply to our efforts to market
products and services to you; you may receive
information about State Farm products that we
believe may suit your needs."


The following sentences are quoted directly
from the privacy policy of Chase: "Even if you
do tell us not to share, we may share other types
of information within our family. For example,
we may share name and address, information
about transactions or balances with us, as well
as survey results."

The third example is from the privacy policy of
Microsoft: "Microsoft.com may send
out periodic e-mails informing you of technical
service or security issues related to a product or
service you requested, or confirming you requested a product or service. In some services offered
by Microsoft.com, contact with the customer is
an intrinsic part of the service. You will not be
able to choose to unsubscribe to these mailings,
as they are considered an essential part of the
service(s) you have chosen."

Future Trends

As the Internet and e-commerce become an integrated part of doing business in today's digital
economy, and as companies collect more
personal information from consumers and use
it to understand consumers' needs and market
their products, consumers will demand stricter
protection of their privacy, which in turn will force
companies to implement higher standards of privacy protection.
For example, compared to the survey conducted
by Culnan in 1999, our research shows that more
companies have posted an online privacy policy,
which suggests that awareness of privacy protection rose between 1999 and 2005.

Conclusion

This chapter investigates the current status of the online privacy policies of Fortune 100 companies. It
is found that 94% of the surveyed companies have
posted an online privacy policy and 82% of them
notify consumers of the collection of information
from them. The majority of the companies still
only partially follow the four principles (notice,
choice, access, and security) of fair information
practices. For example, most of the organizations
give consumers some notice and choice in terms
of the collection and use of their personal information. However, organizations fall short of the security
requirements. Only 19% of the organizations mention
that they have taken steps to provide security for
information both during transmission and after
receiving the information. The results also reveal
that a few organizations have obtained third-party
privacy seals, including TRUSTe, BBBOnline
Privacy, and Safe Harbor.

The findings show that almost all of the Fortune
100 companies have posted an online privacy
policy. However, room exists to improve their
online privacy practices. As an increasing number
of consumers become concerned about how their
personal information is used in the electronic marketplace, a well designed and implemented privacy
policy is needed to build consumers' trust and thus
increase online sales. In particular, companies need
to implement effective security measures to
protect the information of consumers both during
transmission and after the site has received
the information, which is currently lacking for
most of the surveyed companies.

Consumers need to be educated about the possible
privacy and security threats of submitting personal information online and to take a more active
role in protecting their online privacy. According
to Forrester Research, less than 1% of the visitors
to six major online travel sites during April 2001
actually read privacy policies (Regan, 2001). This
may explain why companies are not taking online
privacy policies seriously, since the impact of
such policies on consumers' online purchasing
is very limited. To encourage the implementation
of good online privacy policies by organizations,
consumers should at least read online privacy
policies before submitting personal information
online, and may report the violation of a privacy
policy to the company or a higher authority.



Currently, legislation for Internet privacy
protection has not been fully implemented. This
may partially explain why some companies do not
follow the fair information practices fully, and it
shows the limitations of industry self-regulation
in online privacy protection.

Our study has also shown the limitations
of third-party privacy seals. They are not
widely adopted (19%), and research shows that
these private seal-of-trust programs either fail
to be recognized and trusted by online consumers (Moores, 2005) or are only effective on
inexperienced consumers (Kimery, 2002). These
programs also suffer from a lack of resources for
monitoring member organizations' compliance with
the seal requirements (Miyazaki, 2002). Therefore, although
there is potential to improve consumers' trust
and organizations' privacy protection practices,
third-party seals of trust may still be in
their trial-and-error period (Zhang, 2005).

In addition, technical privacy enhancements
such as P3P, which try to automate the checking of
privacy preferences against policies, are limited by
low acceptance (4.5%). Other online privacy gadgets,
such as anonymous surfing tools (thefreecountry.
com, 2007), may be effective against tracking
cookies. However, when a consumer needs to
conduct an online transaction, he or she must submit personally identifiable information. Hence,
these anonymity gadgets also have significant
limitations.

Therefore, more mature and complete future
legislation for online privacy protection still seems
to be the only alternative for enforcing meaningful and effective online privacy policies. For the
time being, consumers should be aware of the
potential risk and use caution (such as
reading the online privacy policy) before compromising online
privacy for convenience.


Future Research Directions

Future research should study
the impact of contextual factors, such as firm
size and type of industry, on the quality of online
privacy policies. It is anticipated that large firms or
specific industrial sectors may require stronger
privacy policies. In addition, future research can be
extended to study the practice of online privacy
policies in other countries and to see how they
differ from those in the U.S.

Another direction for future research may
focus on consumers' perceptions of and attitudes
toward online privacy policies, which have not
received enough attention in the literature. Questions
such as the following can be investigated:
Do consumers realize that an online
privacy policy exists? Do consumers read the online
privacy policy of a company before purchasing
from it? Do consumers believe that a company
will follow its online privacy policy faithfully?

In addition, it will be of interest to study the
effect of online privacy policies on consumers.
For example, how does the quality of an online
privacy policy affect a consumer's online trust,
which in turn influences his or her intent to purchase
online? Besides online privacy, what other
factors are necessary in building online trust?

Another question is how to increase
consumers' awareness of online privacy policies
and online privacy protection, given that the Forrester Research study shows that few consumers read
online privacy policies (Regan, 2001).

References

Baumer, D. L., Earp, J. B., & Poindexter, J. C. (2004). Internet privacy law: A comparison
between the United States and the European
Union. Computers & Security, 23(5), 400-412.

BBBOnline. (2005). European Union/US safe
harbor compliance. Retrieved March 25, 2005,
from http://www.bbbonline.org/privacy/eu.asp
CDT. (2005). Center for democracy and technology. Legislation Center. Retrieved March 25, 2005,
from http://cdt.org/legislation/
Clarke, R. (1999). Internet privacy concerns confirm the case for intervention. Communications
of the ACM, 42(2), 60-67.
Culnan, M. J. (1999). The Georgetown Internet
privacy policy survey: Report to the Federal
Trade Commission. Retrieved April 15, 2005,
from http://www.msb.edu/faculty/culnanm/gipps/
gipps1.pdf
Desai, M. S., Richards, T. C., & Desai, K. J.
(2003). E-commerce policies and customer privacy. Information Management & Computer Security,
11(1), 19-27.
EDRI. (2004). EU report: member states lazy to
protect data. Retrieved August 20, 2007, from
http://www.edri.org/edrigram/number2.24/report
Englund, S., & Firestone, R. (2004). California law
regulates web site privacy policies. The Computer
& Internet Lawyer, 21(8), 22-23.
FTC. (1998). Privacy online: A report to congress.
Retrieved March 25, 2005, from http://www.ftc.
gov/reports/privacy3/priv-23a.pdf
FTC. (2005). Federal trade commission privacy
initiatives. Retrieved March 25, 2005, from http://
www.ftc.gov/privacy/index.html
Kimery, K. M., & McCord, M. (2002). Thirdparty assurances: mapping the road to trust in
e-retailing. Journal of Information Technology
Theory and Application, 4(2), 63-82.
Lichtenstein, S., Swatman, P., & Babu, K. (2003).
Adding value to online privacy for consumers:
Remedying deficiencies in online privacy policies
with a holistic approach. In Proceedings of the
36th Hawaii International Conference on System
Sciences.
McRobb, S., & Rogerson S. (2004). Are they
really listening? An investigation into published
online privacy policies at the beginning of the
third millennium. Information Technology &
People, 17(4), 442-461.
Miyazaki, A., & Krishnamurthy, S. (2002). Internet seals of approval: effects on online privacy
policies and consumer perceptions. Journal of
Consumer Affairs, 36(1), 28-49.
Moores, T. (2005). Do consumers understand the
role of privacy seals in e-commerce? Communications of the ACM, 48(3), 86-91.
Nyshadham, E. A. (2000). Privacy policy of air
travel web sites: a survey and analysis. Journal of
Air Transport Management, 6(3), 143-152.
OPA. (2005). Online privacy alliance: guidelines
for online privacy policies. Retrieved March 25,
2005, from http://www.privacyalliance.org/resources/ppguidelines.shtml
Privacy & American Business. (2002). Privacy on
and off the internet: what consumers want.Privacy
& American Business. Hackensack, NJ.
Regan, K. (2001). Does anyone read online privacy
policies? Ecommerce Times. Retrieved October
2007, from http://www.ecommercetimes.com/
perl/story/11303.htmlthefreecountry.com (2007).
Free anonymous surfing. Retrieved October 25,
2007, from http://www.thefreecountry.com/security/anonymous.shtml
W3C. (2002). The platform for privacy preferences
1.0 (P3P1.0) specification. Retrieved October 25,
2007, from http://www.w3.org/TR/P3P/
Warren, S., & Brandeis, L. (1890). The right to
privacy. Harvard Law Review, 4(5), 193-220.
Wolf, C. (2004). California's new online privacy
policy law has nationwide implications. Journal
of Internet Law, 7(7), 3-8.


Zhang, H. (2005). Trust-promoting seals in
electronic markets: Impact on online shopping.
Journal of Information Technology Theory and
Application, 6(4), 29.

Additional Readings
Antón, A. I., Bertino, E., Li, N., & Yu, T. (2007).
A roadmap for comprehensive online privacy
policy management. Communications of the ACM,
50(7), 109-116.
Anton, A. I., Earp, J. B., He, Q. F., Stufflebeam,
W., Bolchini, D., & Jensen, C. (2004). Financial
privacy policies and the need for standardization.
IEEE Security & Privacy, 2(2), 36-45.
Ashrafi, N., & Kuilboer, J. (2005). Online privacy
policies: an empirical perspective on self-regulatory practices. Journal of Electronic Commerce
in Organizations, 3(4), 61.
Brown, D. H., & Blevins, J. L. (2002). The safeharbor agreement between the United States
and Europe: A missed opportunity to balance
the interests of e-commerce and privacy online?
Journal of Broadcasting & Electronic Media,
46(4), 565.
Chen, K., & Rea, A. L., Jr. (2004). Protecting
personal information online: a survey of user
privacy concerns and control techniques. Journal
of Computer Information Systems, 44(4), 85.
Cockcroft, S. (2002). Gaps between policy and
practice in the protection of data privacy. Journal
of Information Technology Theory and Application, 4(3), 1.
Cranor, L., Guduru, P., & Arjula, M. (2006).
User interfaces for privacy agents. ACM Transactions on Computer-Human Interaction, 13(2),
135-178.
Liu, C., Marchewka, J. T., & Ku, C. (2004).
American and Taiwanese perceptions concerning
privacy, trust, and behavioral intentions in electronic commerce. Journal of Global Information
Management, 12(1), 18.
Malhotra, N. K., Kim, S. S., & Agarwal, J. (2004).
Internet users' information privacy concerns
(IUIPC): The construct, the scale, and a causal
model. Information Systems Research, 15(4),
336-355.
Mascarenhas, O. A., Kesavan, R., & Bernacchi,
M. (2003). Co-managing online privacy: a call for
joint ownership. Journal of Consumer Marketing, 20(7), 686.
Meinert, D., Peterson, D., Criswell, J., & Crossland, M. (2006). Privacy policy statements and
consumer willingness to provide personal information. Journal of Electronic Commerce in
Organizations, 4(1), 1.
Metzger, M., & Docter, S. (2003). Public opinion
and policy initiatives for online privacy protection.
Journal of Broadcasting & Electronic Media,
47(3), 350.
Milne, G., & Culnan, M. (2002). Information society, using the content of online privacy notices to
inform public policy: a longitudinal analysis of the
19982001 U.S. Web Surveys, 18(5), 345-359.
Milne, G., Rohm, A., & Bahl, S. (2004). Consumers protection of online privacy and identity.
Journal of Consumer Affairs, 38(2), 217.
Pan, Y., & Zinkhan, G. (2006). Exploring the
impact of online privacy disclosures on consumer
trust. Journal of Retailing, 82(4), 331-338.
Papacharissi, Z., & Fernback, J. (2005).Online
privacy and consumer protection: an analysis of
portal privacy statements. Journal of Broadcasting & Electronic Media, 49(3), 259.
Peslak, A. (2006). Internet privacy policies of
the largest international companies. Journal of
Electronic Commerce in Organizations, 4(3),
46-62.

Pollach, I. (2005). A typology of communicative
strategies in online privacy policies: Ethics, power
and informed consent. Journal of Business Ethics,
62(3), 221-235.

Pollach, I. (2006). Privacy statements as a means
of uncertainty reduction in WWW interactions.
Journal of Organizational & End User Computing, 18(1), 23-49.

Pollach, I. (2007). What's wrong with online
privacy policies? Communications of the ACM,
50(7), 103-108.

Roussos, G., & Theano, M. (2004). Consumer
perceptions of privacy, security and trust in
ubiquitous commerce. Personal and Ubiquitous
Computing, 8(6), 416.

Rowland, D. (2003). Privacy, freedom of expression and cyberslapps: Fostering anonymity on the
internet? International Review of Law Computers
& Technology, 17(3), 303-312.

Schuele, K. (2005). Privacy policy statements
on municipal websites. Journal of Government
Financial Management, 54(2), 20.

Shah, J., White, G., & Cook, J. (2007). Privacy
protection overseas as perceived by USA-based
IT professionals. Journal of Global Information
Management, 15(1), 68-81.

Wolf, C. (2004). California's new online privacy
policy law has nationwide implications. Journal
of Internet Law, 7(7), 3-8.


Appendix

Fortune 100 Companies (Accessed in February 2005)
1. Wal-Mart Stores
2. Exxon Mobil
3. General Motors
4. Ford Motor
5. General Electric
6. ChevronTexaco
7. ConocoPhillips
8. Citigroup
9. Intl. Business Machines
10. American Intl. Group
11. Hewlett-Packard
12. Verizon Communications
13. Home Depot
14. Berkshire Hathaway
15. Altria Group
16. McKesson
17. Cardinal Health
18. State Farm Insurance Cos
19. Kroger
20. Fannie Mae
21. Boeing
22. AmerisourceBergen
23. Target
24. Bank of America Corp.
25. Pfizer
26. J.P. Morgan Chase & Co.
27. Time Warner
28. Procter & Gamble
29. Costco Wholesale
30. Johnson & Johnson
31. Dell
32. Sears Roebuck
33. SBC Communications
34. Valero Energy
35. Marathon Oil
36. MetLife
37. Safeway
38. Albertsons
39. Morgan Stanley
40. AT&T
41. Medco Health Solutions
42. United Parcel Service
43. J.C. Penney
44. Dow Chemical
45. Walgreen
46. Microsoft
47. Allstate
48. Lockheed Martin
49. Wells Fargo
50. Lowe's
51. United Technologies
52. Archer Daniels Midland
53. Intel
54. UnitedHealth Group
55. Northrop Grumman
56. Delphi
57. Prudential Financial
58. Merrill Lynch
59. DuPont
60. Walt Disney
61. Motorola
62. PepsiCo
63. CVS
64. Viacom
65. Sprint
66. Sysco
67. Kmart Holding
68. TIAA-CREF
69. American Express
70. New York Life Insurance
71. International Paper
72. Tyson Foods
73. Wachovia Corp.
74. Goldman Sachs Group
75. Duke Energy
76. Honeywell Intl.
77. Caterpillar
78. Best Buy
79. Johnson Controls
80. BellSouth
81. Ingram Micro
82. FedEx
83. Merck
84. ConAgra Foods
85. HCA
86. Alcoa
87. Electronic Data Systems
88. Bank One Corp.
89. Comcast
90. Mass. Mutual Life Ins.
91. Coca-Cola
92. Bristol-Myers Squibb
93. WellPoint Health Networks
94. Georgia-Pacific
95. Weyerhaeuser
96. Abbott Laboratories
97. AutoNation
98. Williams
99. Supervalu
100. Cisco Systems


Chapter XIV
Cross Cultural Perceptions on Privacy in the United States, Vietnam, Indonesia, and Taiwan

Andy Chiou
National Cheng Kung University, Taiwan

Jeng-chung V. Chen
National Cheng Kung University, Taiwan

Craig Bisset
National Cheng Kung University, Taiwan

Abstract

In this chapter, the authors briefly discuss some cross cultural concerns regarding Internet privacy.
The authors believe that, due to the cross cultural nature of the Internet itself, different cultures will tend
to have different concerns regarding Internet privacy. As such, there is no single system of protecting
Internet privacy that is suitable for all cultures. The authors also utilize focus groups from various
countries spanning Asia and the United States to discover the differences between cultures. Hopefully, an
understanding of such differences will aid future research on Internet privacy in taking a more culture-sensitive approach.

Introduction

As the world's population becomes increasingly
plugged into the Internet, many of the new and
wondrous capabilities that the Internet has come
to offer have come under serious scrutiny and
debate as double-edged swords. File sharing,
addiction, freedom of speech, and online gaming
are just a handful of topics that have spurred
heated discussions. More often than not, one
side of the debate invariably comes to involve
underlying privacy issues. One example is the
popularity of blogs. Hailed by many as a means
of transforming journalism and disseminating
information (Dietrich, 2006), blogs have also been
scrutinized by companies and organizations as
things that must be tightly controlled as a matter
of corporate strategy (Hanson, 2006). However,
the questions that then arise are: How private are
personal blogs? Should an employee's personal
blog also be restricted in content? Should an
individual be held publicly accountable for an offhand comment in a personal blog? As organizations
trend towards restricting employee blogs
and, in severe cases, firing employees for personal
blogs (Horwedel, 2006), the debate has moved
onto free speech and privacy concerns. As complicated as that debate is, the issue of blogs is but
a small slice of the privacy concerns that plague the
Internet. One subject that looms on the horizon,
and yet has not been seriously examined, is the
question of Internet privacy across cultures. The
views and laws governing Internet privacy in the
United States differ from those in the European
Union, which in turn differ from those in many
Asian countries. The cross-border nature of the
Internet means consideration must be given to
different cultural perceptions of this matter. This
chapter will briefly discuss some of the concerns in
approaching privacy on the Internet, in particular
how different cultures view and handle privacy.

Background

Just as the Internet has allowed for the instant
transmission of much-needed information, it has
also become a channel for some unsavory elements. Organizations and individuals can now
collect information on individuals with speed,
ease, and relative accuracy. Masses of unwanted
solicitation e-mails, commonly known as spam,
have been identified by many as a scourge upon
the Internet (Herbert, 2006). Also of concern
are the various scams that are perpetrated over the
Internet, the most famous probably being the 419
scam, more commonly known as the Nigerian
scam, in which the e-mail recipient is asked to provide private information in exchange for large
amounts of cash. While at first glance this
scourge may simply be an annoyance for ISPs
and e-mail inboxes, privacy is also an underlying
concern here. Most companies that participate in
spam and scams obtain e-mail lists that
are available for purchase commercially. E-mail addresses
on such lists may be harvested manually or, more
commonly, simply obtained when Internet users
carelessly disclose their e-mail addresses.

However, while arguments rage over various
sides of the privacy issue, be it debates over the
blurring of public and private space and ethical
responsibilities (Tavani & Grodzinsky, 2002),
the legal and ethical battle between the right to know
and the right to privacy (Sitton, 2006), or clashes over
whether Internet service providers (ISPs) are required to disclose user information to the Recording
Industry Association of America (RIAA), little
attention has been paid to how different countries
react to the issue of Internet privacy. In the United
States, several years of unsolicited telephone calls
and spam e-mail have resulted in the National
Do Not Call (DNC) Registry, the CAN-SPAM
Act, and the Spy Act of 2007 (H.R. 964), and Verizon, a United States cellular telephone service
provider, has recently won a court case against
cell phone spammers. While the intentions and
effectiveness of acts such as CAN-SPAM and
H.R. 964 are fiercely debated, it is important to
recognize that, for better or for ill, legal measures
have been taken in an attempt to curb or at least
control matters involving privacy and spam.

In Asia, however, measures such as the DNC
and CAN-SPAM are nowhere to be seen. Many
residents of Taiwan complain privately about receiving telephone calls from scammers, and spam
e-mail is as prevalent as ever. However, no attempt
has been made to legislate against unsolicited phone calls
and e-mails, as Americans have done. While most Taiwanese citizens view such spam as a nuisance,
little thought is spent on how personal mobile
phone numbers, e-mail addresses, and family information
are obtained by spammers and scammers in the
first place. This is very different from American
attitudes, where thoughts of spam and scams bring
to mind fears that personal information is being
freely peddled on the open market. At present,
no in-depth empirical research has yet been done
on these differences.

Capurro, in his 2005 study of intercultural
views on privacy, provided a brief philosophical
glimpse at the possible reasons behind differences
in Japanese and German views on privacy, focusing on how these two different cultures view the
concept of private space. Dinev, Bellotto, Hart,
Russo, Serra, and Colautti (2006) also looked into
the differences in privacy concerns between Italy
and the United States, utilizing a more empirical approach by comparing cultural differences
and privacy concerns via questionnaires. Most
comparative studies done thus far have used the
cultural model developed from Hofstede's (1980)
IBM study for comparison.

Issues and Controversies

Millions of Internet users today post photos of
themselves and their families on the Internet, or
release important information about themselves by
shopping online, essentially disregarding their
anonymity when using the Internet (Hsu, 2006).
Has privacy become less important in an age
where information is readily available at the click
of a mouse? Privacy is the ability to control the
conditions under which one's personal information is collected; it is the right of an individual to
protect himself from intrusion into his personal
life. To say that the Internet has created new
problems for individual privacy is to state the
obvious. Personal information that was previously
inaccessible now is accessible, in large quantities and with
a high level of detail.

In the post-September 11th world, the right
to privacy is no longer the only issue, as the spotlight
has been cast upon the right to information. The
right to privacy clashes with security, and people
are more inclined to give up personal information
to feel more secure (Hsu, 2006). Legal action to
protect individual privacy is scarce; this is not to say that
action has not been taken, but it has been limited. As
of this writing, no single global privacy standard
has emerged. Each country has a different perspective on how to deal with the issue
of privacy, with some ignoring it completely. This
has led to a broad divergence in approaches.

In the United States, the creation of the National
Do Not Call (DNC) Registry, the CAN-SPAM Act,
the Internet Spyware Prevention Act of 2007, and
various court decisions have all established
precedents for the protection of privacy. The direction is
clear, but whether it is enough is debatable.

National Do Not Call (DNC) Registry

The National DNC Registry was created to give individuals a choice about whether they are willing
to receive telemarketing calls at home. After
a phone number has been registered with the
DNC for 31 days, telemarketers may not make
unsolicited calls to the registered number. If the
registrant does receive an unsolicited phone call,
a complaint may be filed at the DNC Web site. The
complaint is then handled by the Federal Trade
Commission (FTC). Registered numbers are
protected for up to 5 years, after which they will
have to be registered again (National Do Not Call
Registry, 2003). However, the National DNC has
several limitations, first and foremost being that
the DNC is limited to commercial entities. Non-profit and survey organizations are still able to
make unsolicited calls. While some decry this
as a loophole in the DNC, others view it as an
essential protection of free speech. Thus far, the
National DNC Registry has elicited fairly positive
responses from the general U.S. public.


CAN-SPAM Act

The CAN-SPAM Act went into effect in January
2004. It covers e-mails whose primary purpose is
advertising. The act establishes requirements for companies sending out commercial
e-mail and explains clearly the punishments for
spammers and companies that do not comply
with the regulations set in the act. These regulations
include: banning misleading header information, requiring a statement of the originating domain name and
e-mail address, and banning deceptive subject
lines. The act also gives consumers the right to
ask advertising companies to stop sending
spam to them; every
spammer or company must offer an opt-out option
for consumers. Lastly, it requires that
the e-mail be identified as an advertisement and
include the sender's postal address (CAN-SPAM
Act, 2003). However, one severe limitation of the
CAN-SPAM Act is that it is not able to effectively
stop spam, which is what most Internet users would
wish for. Instead, the CAN-SPAM Act grants
spam a legitimacy that was previously lacking.
Spammers are now merely required to openly
state the purpose of their e-mails, sending out
e-mails entitled "Advertisement: BUY VIAGRA
NOW" instead of thinly veiled e-mails. In doing
so, spammers become law-abiding citizens, albeit
something of a nuisance. The second limitation
of the CAN-SPAM Act is that the act is not
enforceable outside of the United States, where
most spam originates.

Spy Act of 2007

In the past few years, anti-spyware efforts have
gained momentum. The Spy Act bans the more
obvious forms of spyware, such as those that
hijack a person's computer or log keystrokes. Keystroke-logging software records all the text typed
on the keyboard and runs completely hidden,
without the person knowing it. However, there
are issues with the Spy Act. Many people point
to the limitations of the act to find the problems
associated with it. The limitations essentially give
hardware, software, and network vendors carte
blanche to use spyware, even some of the more
blatant forms, to monitor their customers'
use of their products and services. Most see the
Spy Act as merely an extension of the CAN-SPAM
Act, in that it merely gives the real perpetrators of
spam and spyware a legal mandate. Whether or
not this was the original intention is debatable;
however, it is true that CAN-SPAM and the Spy
Act contain loopholes that can be exploited by
various parties. Therefore, the act has come under
a lot of criticism, and for good reason (Internet
Spyware Prevention Act, 2007).
The legislation described above notwithstanding,
nowhere has Internet privacy been debated more
heatedly, and used more often as a means of defense
in court, than in cases involving file sharing. The
ease with which digital copies can be created has
made the prevention of copyright violation increasingly difficult. As a result, copyright holders
such as the Recording Industry Association of
America (RIAA) and the Motion Picture Association
of America (MPAA) in the United States, and
their global counterpart, the IFPI, have shifted
their focus from large-scale organizations to individuals who make copyrighted works available
over the Internet. Copyright holders now often
locate, investigate, and sue individuals who make
copyrighted works available over the Internet.
There have been court proceedings by copyright
holders around the world in an attempt to stop
file sharers from putting copyrighted works on the
Internet. However, in order to do so, the
IP address of the file sharer must be identified and
subsequently linked to a specific address before
search and seizure can be conducted. This is where
the privacy debate usually comes in.
Copyright holders state that ISPs must disclose
all information when requested, while defendants
argue that for ISPs to do so would be a violation
of user privacy. Listed below is a sampling of influential
court decisions in the United States.


Verizon vs. Recording Industry Association of America

In July of 2002, the Recording Industry Association of America (RIAA) issued a subpoena to
Verizon Online requesting that the phone company
reveal the identities of customers whom the RIAA
believed were in possession of copyrighted material. Verizon refused to cooperate with the subpoena on the grounds that the Digital Millennium
Copyright Act states that copyrighted material
must actually reside on Verizon Online's system
or network in order for Verizon to have to comply
by law. By simply being an ISP, Verizon Online
was not accountable for the actions of its users.
Therefore, since the material that was allegedly in
violation of copyright law according to the RIAA
existed on customers' computers, Verizon did not
have to comply with the subpoena. In response,
the RIAA sued Verizon and obtained the court's
agreement that Verizon had to comply with the
RIAA's subpoena. Verizon immediately appealed
in the U.S. Court of Appeals on the grounds that
the RIAA's single overarching subpoena for all
users was in violation of the Constitution. In
order for the RIAA's request to have been within
constitutional limits, the RIAA would have been required
to file a subpoena for every user infraction it had
evidence for. The Court of Appeals agreed with
Verizon's argument and struck the case down.
Verizon's appeal laid down important groundwork.
Had Verizon not won the appeal, the door would
have been open for anyone who suspected a user of
copyright violation to obtain private information
without specifying exactly what or who was under
suspicion (Recording Industry Association of
America v. Verizon Internet Services).

Capitol vs. Does 1-16

In the latest of a series of RIAA court cases against
file sharers, the ruling issued by Judge Lorenzo
F. Garcia in the U.S. District Court of New Mexico
shows that Verizon v. RIAA was not an isolated
event in the U.S. Judge Garcia ruled that the
RIAA's tactic, identical to that used in Verizon
v. RIAA, could not guarantee an individual's due
process rights. That is to say, the RIAA's tactic
could not guarantee that an individual charged
in a single overarching subpoena issued against
an entire group of people would be aware of the
subpoena issued against him, and he would therefore
not have a chance to challenge the subpoena in
court for self-protection (Capitol v. Does 1-16). This
further extends the legal precedent in the protection
of online individual privacy: organizations or
entities cannot issue overarching subpoenas
in order to embark on a fishing expedition into
individual user information.

Comparative Analysis of U.S. and EU Workplace Privacy Laws

One of the fundamental differences between U.S.
and EU privacy laws is in the approach taken by
regulators in the two jurisdictions. The EU takes
a data protection approach, whereas the U.S. has
more of a civil rights approach to privacy (King,
Pillay, & Lasprogata, 2006). Genetic information
in the workplace is treated as a civil rights issue
in the United States. Legislation is then designed
to address misuses of genetic information, since such misuse
is considered workplace discrimination: acquiring and using employees' genetic information
for making employment decisions is a form of
discrimination. In contrast, the EU, following a
data protection approach, recognizes that individuals have a right to control their personal data
and requires employers to justify any processing
of that data based on legitimate business needs
(King et al., 2006). It closely mirrors a privacy
approach by recognizing that individuals have
legitimate interests in controlling their personal
data, and it therefore allows individuals to make
decisions on the use of their data and whether it is
being used for appropriate means. The employer
can, however, justify use of the personal data for
legitimate business needs.


It is important to look at the weaknesses
of both approaches. The most disconcerting
weakness of the EU's data protection approach
arises when genetic information is used unfairly
by EU employers. Legislation does not directly
address this issue. If an employer justifies its
collection and use of employees' personal data
within the framework of the data protection legislation,
then the privacy directive will
not regulate unfair uses of genetic information.
Some would argue that if employers cannot obtain their employees' personal data, then they
cannot use the data inappropriately. However,
others would point out that since exceptions are
made to justify using employees' personal data,
employees are not protected against its inappropriate use
(King et al., 2006).

The most disconcerting weakness of the civil
rights approach taken by the United States is
that the Americans with Disabilities Act (ADA)
does not protect employees from discriminatory
use of their genetic information, since only disabled
employees are protected by the scope of this
law (King et al., 2006). Another major problem
with the ADA is that once an employer uses an employee's personal data inappropriately, there
are no clear avenues for the employee to take
after the fact. If an employer fails to maintain the
confidentiality of an employee's records, thereby releasing genetic information, the breach of the
ADA's confidentiality rules is not accompanied by
an employment action. For example, suppose an employer
discloses an employee's HIV-positive status
and that employee's co-workers learn of it.
Whether the disclosure was deliberate or not, the employee could suffer emotional
distress because private medical information was
released. However, if there was no harassment
involved or other detrimental employment action,
then it is argued that no employment discrimination has occurred.
While the United States has implemented the
DNC, the CAN-SPAM Act, and the Spy Act of
2007 with some amount of controversy, there has
been a movement in the direction of protecting an
individual's information. The European Union
also has, in theory, very stringent laws protecting its citizens' privacy. In contrast,
Asian countries have yet to take an in-depth
look at privacy issues. Privacy laws protecting
citizens from spam, unsolicited phone calls,
and spyware are non-existent. Universities and
ISPs in Asia have also willingly provided information to
record industries, a marked contrast to
the behavior of Verizon, which fought the RIAA
nearly to the very top of the U.S. court system.
While technologies such as RFID tags and the
proposed national ID card in the United States
have elicited concerns regarding violations of
privacy, Asian nations have been among the first
to adopt wireless credit cards and subway passes,
while the idea of a national ID card hardly merits
a bat of the eyelashes. Perhaps cultural differences
are the key to why privacy laws are not yet at the
forefront in Asian countries.

Solutions and Recommendations

Since its inception in the late 1980s, the Internet
has created more questions than it answers about
how people from different cultural backgrounds
view privacy. It is a complex issue that wrestles
with many theoretical norms in regards to how
best to frame it, if at all. Hsu (2006) investigated
differences in privacy and cultural attitudes by
comparing two Eastern cultures, China and Taiwan, with two Western nations, the United States and
Holland. The findings showed that individuals in
each of the four countries differed significantly,
not only in terms of their reactions towards
different types of information-gathering Web
sites, but also in terms of privacy concerns and
privacy practices. Users in the United States were
more likely to disclose information to non-profit
and commercial Web sites, and less likely to disclose it to
government and health Web sites. Furthermore,

users in the United States demonstrated a
willingness to provide demographic information,
but were unwilling to provide any sort of contact
information or official ID, regardless
of the type of Web site. Users in Taiwan and China
were more likely to disclose contact information
and official ID to government and
community Web sites, while being unwilling to provide
the same information to health and non-profit
Web sites.
Many academics in the field of cross cultural
studies acknowledge that the major difference is
the noticeable bias of Western cultures toward
individualistic ideas on privacy, while Asian cultures take a more collectivist view (Capurro,
2005). Hofstede (1997) provided one of the most
widely accepted frameworks for assessing national
cultural norms. By Hofstede's definition, there are
five dimensions that describe these cultural norms,
and they are viewed as providing a stable means of
looking at values and typical behavioral patterns.
Hofstede is not without his critics, who view his
ideas as generalist in nature and lacking in detail
as to the specifics of national and regional cultures.
His stance of seeking out common themes has the
effect of simplifying cultural values at a macro
level. Most critics point to a lack of complexity
and to a neglect of the situational factors
that influence people, who in turn are influenced
by those around them (McSweeney, 2002). This
line of argument calls for more detail, which would allow the specific context of the situation and life's
dynamic nature to intervene. This is a somewhat
naive view, as general themes need to be drawn to
make cross cultural studies useful to academics
and decision makers. Governments and businesses
need clear ideas on cultural differences so quality
decisions can be made.
Milberg (2000) incorporated four of Hofstede's
dimensions into an index to measure privacy.
The index used masculinity, power distance,
uncertainty avoidance, and individualism. The analysis
of privacy issues yielded some interesting findings.
Uncertainty avoidance was found to be negatively
correlated with privacy issues, while masculinity,
power distance, and individualism were found to
be positively correlated (Milberg, 2000).

Low-individualism cultures were found to
have greater acceptance of privacy invasion than
more individualistic cultures. Cultures with low
uncertainty avoidance tend to regard rules and
structure for privacy as unnecessary, while the opposite can be said for high uncertainty avoidance
cultures, which gravitate towards more rules
and laws. Low-masculinity societies place more
emphasis on personal relationships than on using
personal information for economic gain. Power
distance findings show that countries with lower power
distance scores have more trust in powerful interest groups, such as businesses (Bellman, Johnson,
Kobrin, & Lohse, 2004).
Privacy in itself is not a new issue. The rise
of the Internet, accompanied by the ensuing stream of technological devices and communication
possibilities, has instead amplified the issue
of privacy. The ease with which it is now possible to
disseminate information, while remaining anonymous, has added to people's general
awareness of how others may access their
personal information, and is creating much interest in
how this should be dealt with.

This relatively new technological innovation is
affecting how people communicate and how they
view their own culture. The Internet has brought
together people who could not have come together in the
past. It is also creating glaring differences amongst
people, while changing the very nature of how
we communicate. Cultures are always changing,
albeit at a slow pace, but the Internet is potentially
driving this change at an even greater rate.

Perception is a subjective thing when a post-modernist view of the world is taken: the point
of view one takes decides how one approaches a
given topic. Situational effects, as outlined in the
conclusion of Hsu's paper, are the current stumbling block to drawing up a more well-rounded
and explanatory theory of the balance of influence between national cultural dimensions and
individual situational events.


Cultural ties are something we are born into;
a given time, place, and set of circumstances produces
a person with very specific qualities. No two
people are ever the same, even if born into the
same family in the same place (Hofstede, 1997).
Of course, personality and the complex issues of
human development should not be underestimated.
But how does this all affect privacy and the cultural
influences on it?

Hsu has pointed out this issue with glaring
clarity, and we need to differentiate between national cultural differences and situational effects
by studying the various representational
systems and measurable unconscious frameworks
(Bandler, 2000). Bandler noted the universality of people's communication systems and the fact
that they do not vary from culture to culture.
In light of the various areas covered, we would
like to propose a basic framework for approaching
research on cross cultural privacy concerns. We
argue that instead of a universal set of privacy
laws, which would indeed be an ideal end-state
given the cross-border nature of the Internet,
a culturally sensitive approach must be taken. In
order to do this, we propose a socio-political
approach to this discussion. First and foremost,
the very concept of privacy in different cultures
must be defined.

As per Hofstede's (1997) conceptualization,
the United States ranks as one of the highest
countries in terms of individualism, while also
ranking low in power distance and relatively low
in uncertainty avoidance. This is clearly reflected
in the general history of the United States. The
very core idea of the United States has been the
idea of the individual, the underdog, being able to
overcome the environment, competition, and even
superior powers, to beat The Man and succeed in
his own right. Historically, Americans have been
distrustful of a central government. Even in the
post-9/11 world, with the ever-expanding powers of
the federal government, citizen groups such as the
American Civil Liberties Union (ACLU) have
been clamoring for more oversight over powers

regarding terrorism. Recent court decisions have
also made attempts to limit the originally broad
powers of the Patriot Act. Many universities have
also stated their intention to protect student privacy
in the event of RIAA lawsuits.

On the other side of the world, Taiwanese
culture is categorized as collectivist,
relatively high in power distance, and
also relatively high in uncertainty avoidance
(Hofstede, 1997). This is reflected in Chinese
tradition, as individuals tend to be more trusting of
their government and superiors. However, this also
means the government and superiors are expected
to care for their citizens and subordinates, much
as a parent would for a child. The political
structure also reflects this, as oversight of government or even corporate powers is not as strong
as it might be in the United States. This can be
seen in cases where individual privacy might be
an issue, such as a high-profile case in 2001 in which
14 students at a large public university in southern
Taiwan were prosecuted for copyright violation.
Students were not notified of incoming prosecutor
raids nor of their legal status, and most did not
find out until they discovered that their computers
had disappeared while they were away at class.
Although some students did express dissatisfaction with regard to their privacy and various
protests sprang up on campuses around the island,
public discussion quickly turned to condemning copyright violation and reminding other
students to be law-abiding citizens.
Hsu's (2006) study has already indicated
that individuals from different cultures react
differently towards various types of Web sites, and
also display varying willingness in terms of the
type of information they provide on the various
Web sites. Dinev et al. (2006) utilized Hofstede's
(1997) dimensions in order to discover whether
differences among cultures led to different
perceptions of privacy, government intrusion, and
government surveillance in the post-9/11 world.
This study is also interesting in that it compares
the differences in regulatory style between

the United States and Italy, a European Union
member nation. Per Hofstede's dimensions, Italy
is, relative to the United States, a more collectivist
culture, while also being relatively higher in power
distance. Results from Dinev et al. indicate that
Italy possessed lower Internet privacy concerns.
Respondents in the United States possessed higher
Internet privacy concerns, which in turn were
strongly related to government intrusion concerns.
However, at the same time, U.S. respondents
were not strongly concerned about government
intrusion in and of itself. Dinev et al. explained
this as a possible side effect of the post-9/11
world, in which U.S. citizens are willing to allow
greater government surveillance powers in
exchange for greater security. However, we would
like to note that it is also possible the results were
caused by a sampling bias, as the U.S. sample
obtained by Dinev et al. was from the southeastern
region of the United States, which, being notably
Republican "red" states, would not have been
as critical and suspicious of the (at the time of
this writing) Republican federal government as
other regions might be. A sample from the
northeast or from the west coast of the United
States, dominated by Democratic "blue" states,
might have generated drastically different results.
However, the results from Dinev et al. are still valid
in that they point to cultural differences leading
not only to differences in privacy perception, but also
to differences in perceptions of government
actions regarding individual privacy. We would
therefore like to propose the following research
question:

Research Question 1: The more collectivist the culture, the less concern for individual privacy.

The culture of a nation is also influential in forming the legal environment of that nation. The
United States, with its myriad of legislation aimed at protecting individual privacy, and the European
Union, with its stringent directives, contrasted with the lack of similar moves in Asia despite
the same problems, serve to illustrate this. Milberg,
Smith, and Burke's (2000) study discovered a
tentative, albeit marginal, relationship between
culture and the regulatory approaches taken towards
privacy; specifically, a negative relationship
between power distance and individualism on the one hand and
regulatory approaches on the other. A country with high
individualism would have fewer laws governing
privacy. However, this is not congruent with
actual events, as the United States, despite high
individualism, also has a comparatively high
amount of legislative protection for privacy. While
many may point to the patchwork approach of
United States privacy protection as proof
of less government intervention, one should note
that the political system in the United States is essentially 50 separate systems governed by a federal
government. Legally, the federal government may
legislate matters involving interstate commerce;
however, each state may also have its own laws
that apply only within state borders. The recent
rejection of the national ID program in many
states, via state legislatures citing privacy concerns,
is such a scenario. In comparison, more
collectivist cultures such as China and Taiwan
have done very little in the way of protecting
individual privacy. We would therefore like to
propose the following research question:

Research Question 2: The more collectivist the culture, the less legal protection for
individual privacy.

Exploration via Focus Groups

In order to tentatively explore whether our research
questions are valid, a limited exploration was
done via focus groups. A convenience sampling
done via focus groups. A convenience sampling
of students from the Institute of International
Management (IIM) at a large public university
in southern Taiwan was used. The IIM is the
largest of its kind in southern Taiwan, calling
upon a faculty with experience and training from

the United Kingdom, the United States, Taiwan,


India, and Hong Kong. Students are drawn from
various countries, with over half of the student
population from South East Asia. The remaining
half is composed of various students from the
United States and Canada, the United Kingdom,
Africa, and Taiwan. A sampling from this student
population was done primarily to obtain the widest
possible range of cultures amongst participants. As
a result of this convenience sampling, education
levels, age, and socioeconomic status were also
controlled. Participants were all between 22 and 26 years of age, single, had at least obtained a college-level degree in their native country, and were currently pursuing a graduate degree at the
institute. Other than one group of participants
from the United States, English was a common
second language amongst the remaining
participants. Four culturally homogenous focus
groups of four individuals each were used, these
being: Vietnamese, Indonesian, Taiwanese, and
United States of American. Students were asked
to volunteer in a public announcement at the
student research rooms. Once group compositions were decided, each group was invited to meet in an empty research room on different dates.
Moderators utilized a more structured approach
by asking questions regarding attitudes towards
online privacy, spam, and government regulation
towards online privacy. Discussions were limited
to 1 hour for each group. Before initiating the
official discussion topics, moderators engaged
the focus groups in casual conversation regarding
exposure to computers and the Internet. All
individuals in the focus groups were familiar with
using computers for e-mail and daily school work
research and preparation, and all individuals had
engaged in shopping behavior on the Internet,
with most of the individuals having actually made
purchases on the Internet. Almost all individuals
were registered on some sort of social network
site, with Facebook and Friendster being the
predominant sites.

Attitudes Towards Individual Privacy and Spam
All participants in all the cultural groups indicated
that they have received spam e-mail, and most
participants believed that e-mail information was
obtained from various forms filled out either online
or on paper. One participant in the Indonesian
group also stated that she felt a gym she had joined
had sold her contact information to an advertising
company. Discussion in the American group
briefly turned to a recent incident in the university
in which personal information, including student
ID numbers, home telephone numbers, and home
addresses were included on a student contact
booklet and placed in the office in plain sight.
Individuals in the American group unanimously
expressed extreme discomfort over the matter and
had attempted to communicate their feelings with
the office staff. The office responded that it was
normal for contact information to be included
in a student contact booklet in order to facilitate
student socialization and the communication of
alumni in the future, and assured students that the
booklet would not be distributed to non-university
personnel. However, the American students still
felt the university should be held responsible for
failing to protect individual privacy. This issue
was not raised in the other cultural groups. When
questioned by moderators, most participants from
non-American groups felt it was reasonable for a
student contact list to include such information,
as classroom interactions did not allow for a full
range of social interaction. Participants in the
Taiwanese group also stated that it would have
been better if the university had tighter control
over distribution of the student contact list, but
likewise felt it was reasonable for such a contact
list to include personal contact information,
otherwise there was no way to know some of
their classmates better. It is interesting that while
the American group panned the concept of the
student contact booklet, most were enthusiastic
about the alumni group that existed on Facebook.

The non-American groups felt that while Facebook
was a novel idea, a contact booklet made more
sense, as contact via phone or letters was more
intimate in situations that allowed for it.
Participants in the American group further
felt that personal privacy was not respected in
their host country, Taiwan. One individual related
incidents in which he was questioned regarding
his personal income and marital status. Other
participants also related similar experiences.
Most American participants expressed discomfort
regarding these questions; however, some felt
that this was part of the socialization process
and how Taiwanese people became familiar
with each other, commenting that it might be a
characteristic of Asian culture. This issue was
not raised by participants in the Vietnamese and
Indonesian groups, and when asked by moderators,
participants in both groups felt it was a good way
to know people and did not differ much from their
own culture. The Taiwanese group commented
that the ability to discuss matters such as
income and marital status was a signal of trust
amongst people, and the ability to do so would
indicate a good and intimate relationship with
the other person, or a willingness to know the
other person on better terms. Furthermore, some
participants in the Taiwanese group felt that for
foreigners to be asked such questions indicated
that the Taiwanese that had asked these questions
were trying very hard to be a friendly host.

Government Regulation
The Vietnamese group indicated that they felt
no great need for government regulation on
online private information, as did the Indonesian
group. Participants in both groups stated that
while spam was a nuisance, they were content
with merely deleting the offending e-mail. The
Taiwanese group, while expressing opinions that
the government should do something about it,
also exhibited distrust in the telecommunications
industry and believed that it was the ISPs that were

selling e-mail lists to spammers. Participants in the
Taiwanese group also agreed that any attempts by
the government to regulate would not be effective,
as it was not an important enough matter for the
legislature to devote attention to. When asked,
no participants in the Vietnamese, Indonesian, or
Taiwanese groups were aware of specific laws that
protected their privacy, online or off-line.
Participants in the American group felt
that spam was problematic enough to warrant
government intervention. Compared with the
Vietnamese, Indonesian, and Taiwanese groups,
a more intense discussion was elicited from
the American group. Many participants in the
American group felt that spammers were breaking
laws and in serious violation of personal privacy.
Some expressed the wish to sue spammers or
parties who made personal information available.
Some Americans were aware of laws such as the
National Do Not Call (DNC) Registry in the United States and
had taken steps to register phone numbers when
in their home country. All individuals who were
aware of the DNC also expressed wishes that such
laws should be passed in Taiwan as well. Most
Americans were aware of laws that protected
their personal privacy in the form of search and
seizure laws, although comments were made
regarding the ambiguous status of such laws in
the post-9/11 world.

Summary of Findings
Although our focus group discussions were not in
depth, the brief discussions conducted indicated
that it is possible that more collectivistic Asian cultures tend to be less sensitive to violations of personal privacy than individualistic Western cultures may be. American participants
also tended to lean towards a more legislative
solution to the problem of personal privacy.
Vietnamese and Indonesian participants did not
actively express desires for personal privacy to
be legislated. However, Taiwanese participants
did express opinions that personal privacy

should be legislated, albeit most did not expect
such an outcome. It is possible that Taiwan,
due to being more westernized over the course
of the past 30 years, may also possess certain
reactions seen in Western cultures. However,
Taiwanese participants were generally less
sensitive on matters regarding personal privacy
than Americans. Despite this, attention should
be paid here to how different cultures define
privacy. As seen by American reactions towards
Taiwanese questions regarding personal status,
drastic differences may exist.

Future Trends

At present, there is still no universal standard in terms of Internet privacy protection, and no light is yet visible at the end of the tunnel. The United States alone will still be required to wrangle over its own internal standard of privacy protection and obtain a consensus from all 50 states. The European Union, while possessing theoretically sound protection of personal data, has yet to vigorously test its own laws. Differences in opinion between European and American standards of privacy protection have also yet to be settled. Asian countries face an even greater challenge as their populations become increasingly wired into the Internet, while possessing no historically strong legal or cultural basis for privacy protection.

However, as individual citizens become increasingly aware of the pains of spam and inadequate privacy protection, it is possible that nations will eventually step up to the task of legislating privacy protection in an acceptable manner. A series of high-profile murders of spam kings in Russia at the time of this writing would also indicate possible interactions with the criminal world that will eventually force law enforcement agencies to see spam as a serious problem.

The trend of increasing awareness of privacy protection is counterbalanced by the need to combat terrorism in the post-9/11 world. As citizens claim a need for protection from unwanted intrusion, arguments are also made that the same protection afforded private citizens could also shield terrorists from investigations. There is no easy solution to this problem, and it remains to be seen if governments are capable of restraining themselves in the crusade against terrorism, or, failing that, if citizens are capable of restraining their own governments.

Conclusion

Although research on privacy has certainly
garnered much attention, it is important to
keep in mind that not all cultures' perceptions
of privacy are the same, resulting in different
concerns regarding privacy. An American in
Taiwan might feel suspicious if asked to provide
his passport number by a community Web site,
while a Taiwanese in the United States might
be puzzled and alienated by the fierceness at
which people guard their private lives. While
Hofstede's (1997) cultural dimensions may give us some glimpse into the values of each culture, they do little to prepare us for how each culture deals with personal privacy. Hsu's (2006) study
indicated that different cultures do have different
perceptions towards privacy, in particular, what
kinds of information are deemed public enough
to share on various different types of Web sites.
The matter of Internet privacy only adds to the
mix, but is not in and of itself a new Internet
based phenomenon.
Investigations into the laws and cases that form
the basis of legal protection for privacy should also
be handled with care. While the United States is
touted as a self-governing system, does this hold
true when investigation is done at the state level?
As much of the United States legal system is based upon precedent and previous court cases, would
this system not count as a very tight legal control?
Taiwan may have laws ensuring individual

freedoms and rights. However, in the case of
the 14 students whose rooms were searched and
computers seized in 2001, while many students
decried the trampling of individual privacy and
lack of due process, this aspect was given very
little consideration in the legal proceedings or
even long term attention in public press. Would
a researcher then argue Taiwan has plenty of
protection for individual privacy, as a collectivist
culture would per the study of Milberg et al.
(2000), or an apparent lack of such protection,
as witnessed by legal proceedings and lack of
attempt to legislate spam mail and phone calls?
Due to the various differences and backgrounds
that form privacy perceptions, great care and
sensitivity should be taken when conducting
privacy studies across cultures.

Future Research Directions

As continuing research is being done on the topic of privacy protection, many issues remain to be examined in depth. Of particular interest would be how the United States faces this problem. While many scholars choose to view the United States as a single country, the fact that various states in the United States have chosen to reject the national ID program proposed by the federal government on the grounds of privacy concerns indicates that the United States is far from a singular entity. This is further enhanced by the United States being affected most directly by the events of 9/11. What has been done to protect citizen privacy is as important a topic as what has been done to combat terrorism, perhaps even more so.

Of equal concern is how governing bodies in the United States balance the needs of commercial entities and private citizens. The latest developments at the time of this writing, in October 2007, have the governor of California vetoing a proposal for one of the most stringent data protection acts in the United States, citing concern for increasing costs of compliance.

How Asian nations continue to evolve and view online privacy is also worth reviewing. While the Great Firewall of China currently maintains some semblance of segregating China's network from the rest of the world, online privacy is increasingly a worldwide problem. Lacking even the strong historical legal precedence available in the United States, the Asian countries must develop their own systems for protecting privacy while at the same time combating further encroachments.

However, these proposed research directions merely scratch the surface of how to look at privacy protection. It remains up to individual researchers to determine how best to analyze the myriad of issues in depth. At the current stage of research, any addition to the field would contribute to the overall understanding.

References

Bandler, R. (2000). Persuasion engineering. Capitola, CA: Meta Publications.
Bellman, S., Johnson, J. J., Kobrin, S. J., & Lohse, L. L. (2004). International differences in information privacy concerns: A global survey of consumers. The Information Society, 20, 313-324.
BMG Canada Inc. v. John Doe, Volume Number
858, (2005).
CAN-SPAM Act of 2003, S.877, 108th Congress,
(2003).
Capitol v. Does 1-16, 07-485, WJ/LFG.
Capurro, R. (2005). Privacy. An intercultural
perspective. Ethics and Information Technology,
7, 37-47.
Dietrich, W. (2006). Are journalists the 21st
century's buggy whip makers? Nieman Reports,
60(4), 31-33.
Dinev, T., Bellotto, M., Hart, P., Russo, V.,
Serra, I., & Coluatti, C. (2006). Internet users'
privacy concerns and beliefs about government
surveillance: An exploratory study of differences
between Italy and the United States. Journal of
Global Information Management, 14(4), 57-93.
Hanson, K. (2006). Should the boss be blogging?
Strategic Communication Management, 10(2),
6-7.
Herbert, N. (2006). Conquering spam in concert:
Anti-spam legislative efforts in the Asia Pacific
region. Law Technology, 39(2), 1-12.
Hofstede, G. (1997). Culture and organizations.
New York: McGraw Hill.
Horwedel, D. M. (2006). Blogging rights. Diverse
Issues in Higher Education, 23(2), 28-31.
Hsu, C.-W. (2006). Privacy concerns, privacy
practices and web site categories. Online
Information Review, 30(5), 569-586.
Internet Spyware Prevention Act of 2007, H.R.
1525, 110th Congress, 1st Session, (2007).
King, N. J., Pillay, S., Lasprogata, G. A. (Spring
2006). Workplace privacy and discrimination
issues related to genetic data: A comparative law
study of the European Union and the United States.
American Business Law Journal, 43. Retrieved
June 4, 2007, from https://ezproxy.royalroads.ca
McSweeney, B. (2002). Hofstede's model of national cultural differences and their consequences: A triumph of faith - a failure of analysis. Human Relations, 55(1), 89-118.
Milberg, S. J., Smith, H. J., & Burke, S. J. (2000).
Information privacy: Corporate management
and national regulation. Organization Science,
11(1), 35-57.
National Do Not Call Registry of 2003, H.R. 395,
108th Congress, (2003).
Recording Industry Association of America Inc. v.
Verizon Internet Services, Inc., Volume Number
03-7015, (2003).

Sitton, J. V. (2006). When the right to know and the


right to privacy collide. Information Management
Journal, 40(5), 76-80.
Smith, H. J., Milberg, S. J., & Burke, S. J. (1996).
Information privacy: Measuring individuals'
concerns about organizational practices. MIS
Quarterly, 167-195.
Stone, E. F., Gardner, D. G., Gueutal, H. G., &
McClure, S. (1983). A field experiment comparing
information-privacy values, beliefs, and attitudes
across several types of organizations. Journal of
Applied Psychology, 68(3), 459-468.
Tavani, H. T., & Grodzinsky, F. S. (2002).
Cyberstalking, personal privacy, and moral
responsibility. Ethics and Information Technology,
4(2), 123-133.

Additional Reading
Harper, J. (2006). Identity crisis: How identification is overused and misunderstood. Washington,
D.C.: Cato Institute.
Milberg, S. J., Burke, S. J., Smith, H. J., & Kallman,
E. A. (1995). Values, personal information privacy,
and regulatory approaches. Communications of
the ACM, 38(12), 65-74.
REAL ID Act of 2005, H.R. 418, (2005).
Rotenberg, M. (2006). Real ID, real trouble?
Communications of the ACM, 49(3), 128.
State of Arkansas. (86th General Assembly,
2007). To urge Congress and the United States
Department of Homeland Security to add critical
privacy and civil liberty safeguards to the REAL
ID Act of 2005 and to fully fund or suspend
implementation of the REAL ID Act, SCR 22.
Available at http://www.realnightmare.org/images/File/AR%20SCR22.pdf

State of California. (2007). AB 779. Available
at http://info.sen.ca.gov/pub/07-08/bill/asm/
ab_0751-0800/ab_779_cfa_20070522_154229_
asm_comm.html
State of Maine. (123rd Legislature, 2007). An
Act to prohibit Maine from participating in the
Federal REAL ID Act of 2005, LD 1138, item 2.
Available at http://janus.state.me.us/legis/LawMakerWeb/summary.asp?ID=280023863
State of Minnesota. (Legislative Session 85,
2007). HF 1438. Available at http://ros.leg.
mn/revisor/pages/search_status/status_detail.
php?b=House&f=HF1438&ssn=0&y=2007
State of Missouri. (Special Committee on General
Laws, 2007). HB 868. Available at http://www.
house.mo.gov/bills071/bills/hb868.htm
State of Montana. (60th Legislature, 2007).
HB 287. Available at http://data.opi.state.mt.us/
bills/2007/BillHtml/HB0287.htm

State of New Hampshire. (2007 Session), HB 685.
Available at http://www.gencourt.state.nh.us/legislation/2007/hb0685.html
State of Oklahoma. (51st Legislature, 1st sess.,
2007), SB 464. Available at http://webserver1.lsb.
state.ok.us/2007-08SB/SB464_int.rtf
State of Washington. (60th Legislature, 2007), SB
5087. Available at http://apps.leg.wa.gov/billinfo/
summary.aspx?bill=5087
Stewart, K. A., & Segars, A. H. (2002). An empirical examination of the concern for information privacy instrument. Information Systems
Research, 13(1), 36-49.
Trompenaars, F., & Hampden-Turner, C. (1994).
Riding the waves of culture. London: Nicholas
Brealey Publishing.

Section V

Policies, Techniques, and Laws for Protection

Chapter XV

Biometric Controls and Privacy


Sean Lancaster
Miami University, USA
David C. Yen
Miami University, USA

Abstract
Biometrics is an application of technology to authenticate users' identities through the measurement of physiological or behavioral patterns. The verification system offers greater security than the use of passwords or smart cards. Biometric characteristics cannot be lost or forgotten. As biometric characteristics are concerned with the very makeup of who we are, there are also security, privacy, and ethical concerns in their adoption. Fingerprint, iris, voice, hand geometry, face, and signature are all considered biometric characteristics and used in the authentication process. Examples of everyday biometric applications include thumbprint locks on laptop computers, fingerprint scanners to enter a locked door on a house, and facial recognition scans for forensic use. While there are several examples of biometrics currently in use, it is still an emerging technology. The purpose of this chapter is to provide a descriptive discussion of the current and future state of biometrics.

Introduction
The world is growing increasingly digital as
information systems and networks span the
globe. As individuals, customers, employees, and

employers, we can often connect to the Internet,


and to our information systems, from anytime
and anywhere. The freedom and flexibility that
technology provides is truly astounding when
compared to the limits placed on society just a
few years ago.

Furthermore, data is recognized as a valuable
resource. The information and knowledge that is
created with this data is vital to business, trade, and
the increased convenience of common day-to-day
activities. We use this data to answer a variety of
questions. Companies collect and aggregate data
on their customers, products, and competitors.
Individuals save confidential files on their hard
and soft drives. How is this data secured? How
are the physical and digital systems that store this
data secured? How can we, as citizens of a digital
society, protect ourselves from the theft of this
private data? If you do not trust the information
you are working with, you will not trust the decisions made with that data's analysis.
Biometrics is becoming more and more common as an answer to those questions. Biometric
devices are a means of authenticating user identity
or identifying someone from a list of possible
matches. This chapter will cover why biometrics
is needed, how they are used, important issues in
their adoption, and future trends in their evolution.
Learning Objectives:

• Learn the significance of privacy and the risk of identity theft
• Better understand the need for biometrics in modern society
• Comprehend the technical, economic, business, and ethical issues related to biometrics

The Need for Biometrics


Imagine the most typical of e-commerce transactions, purchasing an item from an online Web site.
You select the merchandise and begin to check out
by filling in your personal information to complete
the order. Now, also imagine someone standing
over your shoulder watching and recording the
data that you submit. Even worse, once you are
finished, this person uses that data to impersonate
you, accessing and using your credit.
Fraud and identity theft are common examples
of cybercrime. The United States Federal Trade
Commission reported nearly 700,000 cases of identity theft and online fraud during 2005, with losses totaling nearly $700 million (Consumer Fraud, 2006). The same report from the FTC listed the most common ways in which consumer information was misused. A summary of that list can be found in Figure 1.

Figure 1. How victims' information is misused (FTC consumer fraud and identity theft complaint data, 2005): credit card fraud, other identity theft, phone/utilities fraud, checking/savings account fraud, employment-related fraud, government documents/benefits fraud, attempted identity theft, and loan fraud.

A key aspect of both fraud and identity theft is


the ability of the cybercriminal to impersonate the
victim while convincing others of the fraudulent
identity. This is especially true for systems that
require only passwords, user logins, or simple ID
swipe cards. For each of these, cybercriminals are
able to obtain the password, login, or card through
techniques that range from human engineering to
user carelessness to sophisticated software programs. Once the cybercriminal has obtained the
password or ID card, there is little to stop them
from impersonating the victim. The password
and card provide access to the prey's physical
and digital systems and assets. Examples include
access to bank accounts, to credit, to government
services, and through physical entryways.
In this light, biometric security measures
would be particularly useful because their use
of unique human characteristics makes it far less
likely that a cybercriminal would be successful
impersonating the victim. While a cyberthief
may steal a password it becomes harder to steal
the pattern of veins in a retina and more difficult
to forge someone else's fingerprints.
Online privacy is not the only activity supporting the use of biometrics, which can be used for any information system that requires the identity of a user to be confirmed. Biometrics is an application of technology to authenticate users' identities through the measurement of
physiological or behavioral patterns. Fingerprint,
iris, voice, hand geometry, face, and signature
are all considered biometric characteristics and
used in the authentication process. Examples of
possible uses include access to buildings, airport
security, access to computers and laptops, and access to ATM and financial systems (Jain, Ross,
& Prabhakar, 2004).

Overview and Evolution


While computer security is a more modern phenomenon due to its dependence on technology,

biometrics dates back centuries. The use of fingerprints by law enforcement agencies to identify
individuals suspected of committing a crime dates
back to the late 19th century (Jain, Ross, & Prabhakar, 2003). Going back to earlier centuries, artists
in East Asia were known for leaving fingerprints
on their work and traders in Egypt were commonly
identified by physical characteristics like height,
hair, and eye color. Citizens and societies have
commonly used unique human characteristics to
identify individuals. Biometric tools are not a new
addition to authenticating identity, but rather the
modern technology has allowed for new methods
of conducting biometrics.
The Theory of Human Identification holds that there are three methods of associating data with a particular human being (Clark, 1994). Knowledge-based identification is when an individual possesses knowledge that only that person would know. For example, passwords, birthdates, and social security numbers represent forms of knowledge-based identification. Token-based identification recognizes a person's possession of an item. Examples include a driver's license, a passport, or an ID card. The third method is biometric identification, using unique and personal physiological characteristics to recognize an individual's identity.
Using human characteristics is seen as a reliable method of confirming one's identity or matching an unknown identity to a list of possibilities.
For example, a user may be granted access by
a biometric device, thus authenticating his/her
identity as a legitimate user. Biometrics can also
be used to search for human characteristics of an
unknown identity, seeking out possible matches.
An example of this second use of biometrics can
be found in detective work, matching the fingerprints of an unknown suspect to a database of
known offenders.
Knowledge and token-based identification are
the traditional methods used to authenticate user
identity. Confirming identity was based on something the user had or was given like passwords,
security badges, clearance codes, and so forth. For
example, Web sites provide, or allow users to create
passwords. Government offices ask to see your
driver's license or ID card to verify you are who
you say you are. The downfall of these methods
is that they rely on items that can be lost, forgotten, stolen, or duplicated (Matyas & Riha, 2003).
Biometrics and its reliance on an individuals
characteristics do not face this pitfall. Biometric
characteristics are human traits that one is born
with. In addition, the ideal human traits used are
those that are unique. Indeed, some biometric
traits, like fingerprints and retinas are unique
even among siblings and identical twins.
As technology has progressed, so has the ability to find unique biometric patterns. Centuries
ago, the emphasis was on obvious physical traits
like hair and eye color. Over time, it evolved to
more subtle characteristics like fingerprints. Today, it is not uncommon to see the use of retina
or iris scans. In the future, it is likely that this
evolution will continue to evolve to use DNA and
other sophisticated human traits and patterns.
The two purposes of biometrics in authenticating user identity are verification and identification. Verification answers the question "Am I who I claim I am?" It confirms or denies an individual's claimed identity. Verification is commonly used to authenticate a user's identity before entering a secure room or before allowing access to a secure system. Identification answers the question "Who am I?" It establishes an unknown individual's identity. Identification is commonly used in law enforcement to recognize and confirm suspects in an investigation.

Examples of Biometric Applications
While the standard example of a biometric human characteristic is the fingerprint, there are
many others in use as well. Common biometric
characteristics are profiled. Figure 2 compares
the biometric characteristics on ease of use and
accuracy (Liu & Silverman, 2001).
Fingerprint recognition focuses on the ridges
on the skin of the fingers. An existing fingerprint
can be queried across a database of prints to find
a match to the users identity. A small number
of people are unable to use fingerprint recognition systems due to excessive wear, dryness, or
exposure to corrosive chemicals (Rosenzweig,
Kochems, & Schwartz, 2004). Fingerprint recognition is presently the most common form of
biometric device in use today and can be found
in a variety of devices from simple USB plug-ins
to wireless phone and PDA protection to entry
door locks.
Facial recognition is the ability to match facial
patterns with those in a database to determine
identity. Facial characteristics measured can include the location and shape of eyes, eyebrows,
nose, and lips. Due to the complexity of the patterns, facial recognition is regarded as being the
most expensive of biometric options. However,
biometric devices using this facial detection are
being used by industries to scan large areas to
recognize suspected persons. Examples include
airport systems screening for known terrorists and
casinos screening against prohibited players.
Hand recognition measures the shape and
distance of key features like finger length, finger

location, and finger size. Patterns can be found
using 96 different measurements of the hand.

Figure 2. Comparison of common biometrics

                Fingerprint   Facial recognition   Hand recognition   Retina      Iris
Ease of use     high          medium               high               low         medium
Accuracy        high          high                 high               very high   very high

Hand recognition has been used by a variety of
industries for over 30 years, and is commonly
used to protect entry points (Rosenzweig et al.,
2004). For example, U.S. nuclear power plants use
hand recognition devices to verify legitimate use.
Hand recognition is also used by businesses for
time clocks for hourly workers. In fact, simple
hand punch systems can be found at office supply stores.
Retina scanning examines the pattern of the
veins found just below the surface of the retina.
Additionally, iris scanning can also be used to
match the patterns found in the colored part
of the eye, surrounded by the white of the eye.
Irises have 266 distinctive characteristics that
can be used in biometric patterns (Jain, Hong, &
Pankanti, 2000). Retina and iris scanning can be
harder to use as the performance of these scanners are negatively impacted by some contacts
and corrective lenses. Additionally, eyeglasses
may make the scanning process uncomfortable.
While both of these biometric measures are not as
common as those profiled earlier, their accuracy
is very high and thus their adoption is likely to
grow in the future.
In addition to these physiological characteristics, biometrics also includes more behavioral
measurements. Examples in this category include
signature verification and keystroke patterns. Signature verification is based on how a person signs their name; unique characteristics may include how hard an individual presses on certain characters, how quickly the letters were created, and the overall appearance of the signature. Keystroke patterns are an individual's tendencies and patterns when typing (Harris & Yen, 2002).

Performance Issues
Important terminology in the field of biometrics
includes false rejection rates (FRR) and false acceptance rates (FAR). It must be noted that biometric devices look to match the user's characteristics
against a database of patterns of characteristics.
It is unlikely, due to environmental and physical
factors, that the database will return an exact
match. Therefore, biometric measures look to
find a probable, or likely match, leaving open the
possibility of false positives and negatives. Figure
3 summarizes the activities of a typical biometric
device. These activities do assume that the user
has been enrolled into the system. Enrollment is
simply the process of registering new users into
the biometric system.
The first step is the actual use of the biometric device by the user. Once the user is scanned,
the system will create a biometric template. The
template is simply a mathematical representation
of the user's characteristics. The third step is to
search for potential matches to the biometric
template. The system looks for matches compared to users who have been enrolled into the
system. These results are analyzed, and scored,
for probability of identity. Finally, the system will
return the results of its processes, and positively
or negatively acknowledge the identity of the user
(Liu & Silverman, 2001).

Figure 3. Five steps of biometrics

Step   Activity            Description
1      User scan           The user scans a hand, finger, or eye using a biometric reader
2      Template creation   The user scan is turned into a biometric template
3      Query database      The biometric database is searched for similar results
4      Analyze results     Algorithms are used to determine possible positive matches
5      Return action       The system negatively or positively acknowledges the user's identity
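
To make these five steps concrete, the following is a minimal sketch, not taken from any particular biometric product, of how such a verification flow might look in code; the feature values, the cosine-similarity scoring, and the 0.95 acceptance threshold are illustrative assumptions only.

```python
# A minimal, hypothetical sketch of the five-step flow in Figure 3
# (scan -> template -> query -> analyze -> return). The feature values,
# cosine-similarity scoring, and 0.95 threshold are illustrative only.
import math

def to_template(raw_scan):
    """Step 2: turn a raw scan (a list of measurements) into a normalized template."""
    norm = math.sqrt(sum(x * x for x in raw_scan)) or 1.0
    return [x / norm for x in raw_scan]

def similarity(t1, t2):
    """Cosine similarity between two normalized templates (1.0 = identical)."""
    return sum(a * b for a, b in zip(t1, t2))

def verify(raw_scan, enrolled_db, claimed_id, threshold=0.95):
    """Steps 3-5: look up the claimed identity, score the match, accept or reject."""
    probe = to_template(raw_scan)
    score = similarity(probe, enrolled_db[claimed_id])
    return score >= threshold, score

# Example: one enrolled user and a slightly noisy probe scan
enrolled = {"alice": to_template([0.61, 0.92, 0.33, 0.75])}
accepted, score = verify([0.60, 0.93, 0.35, 0.74], enrolled, "alice")
print(accepted, round(score, 3))
```

A production system would add enrollment handling, protected template storage, and tuning of the threshold against the error rates discussed next.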

False rejection rates, false negatives, measure


the probability that an acceptable measurement is
rejected. As an example, FRR is when a legitimate
user is rejected, or not identified as legitimate, by
the security device. This rate is also known as a
type I error. Successful biometric devices will have
a low FRR and thus have few type I errors.
False acceptance rates, false positives, measure
the likelihood that an invalid identity is confused
with a legitimate user. In this case, the user is
not genuine but is allowed access by the security
device. This rate is also known as a type II error.
Type II errors are significant because they open
the possibility of security breach.
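
As a rough, self-contained illustration of how these two error rates trade off against one another, the sketch below, using made-up match scores, counts type I and type II errors at several candidate thresholds; raising the threshold lowers the FAR but raises the FRR.

```python
# Illustrative only: estimate FRR (type I) and FAR (type II) from made-up
# match scores at a few candidate thresholds. Genuine scores come from
# legitimate users; impostor scores come from everyone else.
def error_rates(genuine_scores, impostor_scores, threshold):
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    return frr, far

genuine = [0.97, 0.91, 0.99, 0.88, 0.95]
impostor = [0.40, 0.62, 0.55, 0.93, 0.47]
for t in (0.85, 0.90, 0.95):
    frr, far = error_rates(genuine, impostor, t)
    print(f"threshold={t:.2f}  FRR={frr:.2f}  FAR={far:.2f}")
```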

Key Issues
There are a number of important considerations to weigh when analyzing biometrics in use today. This section will profile the technological, economic, business, and legal/ethical issues.

Technological
Before biometric measures can be properly implemented, users must be aware of the technologies
needed to support them. Every biometric device will
need to store a database of positive patterns. For
example, a retina scanner will need to be linked
to a database of acceptable retina scans, used
to confirm or reject the identities of users. The
biometric industry has been fortunate to create
several open standards that support the interoperability of devices. BioAPI and the common
biometric exchange file format are examples of
these standards (Liu & Silverman, 2001).
BioAPI is a joint effort of more than 120 different organizations. The BioAPI group created
a joint, open standard application programming
interface (API) in 2000. This standard allowed
biometric software applications to communicate
with different biometric technologies. BioAPI is

widely used by the majority of industry vendors.


In the future, the BioAPI Consortium would
like to make it possible that biometric devices
become plug and play. For more information visit
the BioAPI Consortium Web site at http://www.
bioapi.org/.
The common biometric exchange file format
(CBEFF) was coordinated in 1999 by a combination of end users and biometric organizations
including several future members of the BioAPI
consortium. CBEFF created common, agreed
upon, data formats for biometric applications.
It allows for biometric information to be passed
between components and the system. The data
described by CBEFF includes security and encryption, processing information, and biometric
data. It can be used in conjunction with BioAPI to
create seamless integration between devices.
Additionally, it must be recognized that this
electronic record storage is no different from
other electronic records. As with any computer device that uses binary to operate, someone looking to breach a biometric security measure need only recreate the series of 1s and 0s that make a
positive pattern in the database. While popular
media portrays biometrics as being broken into
by recreating fingerprints and other human
characteristics, a more likely method of breaking through a biometric security device is to
find the acceptable binary code. This electronic
record storage must be properly secured, as any
tampering would open the possibility of type II
errors. For this reason, most biometric systems
encrypt the templates that are created. Without
rigid security measures, biometrics lose their
value as a protective device.
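
As a hypothetical illustration of this point, a template can be serialized and encrypted with a symmetric key before it is written to storage and decrypted only at match time; the sketch below assumes the third-party Python cryptography package and deliberately leaves key management aside.

```python
# A hypothetical illustration of protecting stored templates: serialize the
# template and encrypt it with a symmetric key before it reaches the database.
# Assumes the third-party "cryptography" package; key management is out of scope.
import json
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # in practice, kept in a key store, never beside the data
cipher = Fernet(key)

template = [0.61, 0.92, 0.33, 0.75]  # toy template values, not from any real scanner

stored_blob = cipher.encrypt(json.dumps(template).encode())   # what the database holds
recovered = json.loads(cipher.decrypt(stored_blob).decode())  # done only at match time
assert recovered == template
```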
As with all procedures that store binary code,
there is a need for adequate storage space. The
biometric databases will need to be efficient to
query to find probable matches to user requests.
As an example, users will find biometric entry
systems slow and frustrating if the system takes
several minutes to respond to their initial scan.
Aside from efficient storage, there must also be

adequate network linkage between the biometric
scanner and database. Again, the design should
be to encourage the use of the biometric device
through proper planning of technical considerations.

Economic
One of the largest drawbacks of biometrics to this
point is the physical cost associated with their
adoption. If an organization implements a biometric security system to guard their entryways,
the system will be needed at all accessible points
to be reliable. In addition to the physical costs,
firms will need to acquire the expertise needed to
setup, maintain, and audit these systems. Finally,
compatibility issues with other information systems will need to be explored in order to prevent
against mismatched programs and reports. A
comprehensive biometric solution can range in
cost from a few hundred dollars to the hundreds
of thousands depending on the scope and level
of complexity needed.
There are more indirect costs to biometric
systems as well. Users may be initially resistant
to change, especially if the need for the new system is not properly communicated. Users may
also be hesitant to provide data for the system, as
they may question the need to scan personal and
unique human characteristics. A perfect example
of this reluctance may be found in fingerprint
scanning systems. Due to their use in criminal
justice applications, some may be suspicious of
a biometric database.
Still, there are numerous benefits to biometric systems. They secure points of entry into physical and digital assets. Rigorous authentication and identity
confirmation can be an asset for those looking to
better protect their systems. Biometric systems
do away with the need to remember passwords
and ID cards. While there is a cost associated
with these systems, that cost may be more easily taken to prevent a cybercrime attack than in
response to one.

Business
There are a number of business considerations
for biometrics. From an end user standpoint,
employees may appreciate the benefits of biometric systems. First, these devices offer increased
security, which alone may make them worthwhile
for highly secretive firms and industries. Second,
biometric measures do offer greater convenience
because they cannot be forgotten or left behind.
Users will not have to remember a number of
passwords or worry about bringing the ID card.
However, users must be trained on the proper use
of the new systems. This is especially true as the
positioning of the eye for a retina or iris scanning
device is critical to the operation.
From an operational standpoint, firms will
need to evaluate vendors in this industry to
find a successful fit with their individual needs.
Depending on the technical skill and savvy, the
firm may be able to manage the system in house
or be reliant on outside suppliers and consultants.
Any successful supplier relationship will require
adequate planning and communication.
It must be noted that biometrics will be only
one component of a comprehensive security plan.
Organizations will use biometrics in conjunction
with other security devices and checkpoints.
Businesses must have multiple layers of security.
For example, a firm may choose to increase the
security of an entryway by using a biometric fingerprint scanner in conjunction with a swipeable
ID card and a password keypad. Furthermore, the
firms methods of guarding its entryways must
be integrated with its overall security strategy.
The sharing of data and intelligence about who is
accessing these systems and what they are using
the systems for is invaluable. Likewise, it would be smart to share data on when someone was not granted access to a system and compare it to other
company systems. Viewing security in a holistic
manner can help identify attempted breaches
before a crime can actually occur.

In addition, multiple instances of biometric
measures may be combined to form a multimodal
biometric system. For example, the use of a fingerprint scanner with an iris reader would provide
two biometric security measures for a greater
level of security assurance. Again, a multimodal system must be recognized as just a single layer of security.
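
One simple way such a combination can be realized is score-level fusion, sketched below with assumed weights and threshold; the chapter does not prescribe any particular fusion rule, so this is only one possible design.

```python
# Illustrative score-level fusion for a multimodal setup: each modality
# produces its own match score, and a weighted sum feeds one decision.
# The weights and the 0.90 threshold are assumptions, not chapter values.
def fuse_and_decide(fingerprint_score, iris_score,
                    w_fingerprint=0.4, w_iris=0.6, threshold=0.90):
    fused = w_fingerprint * fingerprint_score + w_iris * iris_score
    return fused >= threshold, round(fused, 3)

print(fuse_and_decide(0.88, 0.97))  # a strong iris match offsets a weaker fingerprint
print(fuse_and_decide(0.88, 0.70))  # both scores must be reasonably strong to pass
```

Weighting the historically more accurate modality more heavily is one design choice; an alternative is to require each modality to clear its own threshold independently.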
One final business consideration is the maintenance of the biometric system. The scanning
devices will need to be kept clean and hygienic.
The back end hardware must also be kept in good
operation. Data cleanup will be valuable to ensure
that the system relies on accurate enrollment and
template records.

Legal/Ethical
Biometrics, by their very nature, touch on a host
of legal and ethical issues. Any device that requires
the capture, storage, and analysis of unique human
characteristics must be managed with discretion.
Users must be made aware of the need for biometrics and provide their consent for use.
Due to the sensitive nature of these devices,
they would be impacted by a number of privacy
laws that have been created in the United States.
For example, a medical institution using biometrics would need to verify that their processes
comply with the Health Insurance Portability and
Accountability Act (HIPAA). Furthermore, any
loss of data or breach in the system would need
to be recognized and dealt with immediately (Down
& Sands, 2004).
An additional legal concern for biometric
systems is the Identity Theft and Assumption
Deterrence Act, passed by the United States
Congress in 1998. This law protects against the
transfer of material that can be used for identity
theft. The act explicitly includes biometric data in
its list of materials that can be used as a means
of identification (Hemphill, 2001).

The probability of type I and type II errors


must be examined before biometrics can be
implemented. Confirmation is needed that the
biometric device will protect, to the needed level,
against false positives and negatives. In this light,
biometrics must also be examined for the impact
on civil liberties. There will be, and are, concerns
that devices that monitor unique human characteristics will be used for more than verifying and
recognizing identity. Biometric solutions should
be implemented overtly, with the consent of the
user base, to avoid being confused with a Big
Brother environment. The social impact of these
systems will need to be continuously monitored
(Johnson, 2004).
Another ethical issue that should be considered
is the ability of users to remove themselves from
the system. How long are human traits kept in the
biometric database? For example, if an employee
retires, should his/her biometric data be kept in,
or removed from, the database?
Finally, the use of biometrics must be carefully
weighed due to the permanence of the human characteristics that they use. Unlike passwords, there
is no weak or strong biometric. The characteristics
that these systems use are permanent. There is a
simple remedy for a user losing their password;
simply issue or create a new password. This
procedure can also be followed if the password is
compromised. However, if a biometric system is
compromised, their permanence makes it hard to
recover. You cannot reissue a user's fingerprint or
iris. Even more worrisome is what risks that user
faces once their biometrics are no longer unique
and personal (Schneier, 1999).

Future of Biometrics
Biometrics will continue to evolve. In the future,
possible biometric traits to be incorporated into
pattern matching systems include body odor, ear
shape, facial thermography, and DNA matching
(Rosenzweig et al., 2004). It is expected that

the cost of implementation and use of biometric
devices will continue to drop, making it economically feasible for their adoption in new environments and industries.
The wireless and financial industries both
show great potential for future biometric use. As
wireless devices, cell phones, PDAs, and laptops,
become increasingly more common, their security
will grow in importance. In addition, the wireless
devices are being used for more business critical
applications and need to have appropriate levels
of protection. Finance and credit institutions have
always placed a high degree of importance on security and user identity authentication. Biometrics
offer great potential when used with a credit/debit
card to verify the cardholder's identity.
It is also likely that the underlying algorithms
used to match characteristics with the biometric database will continue to improve. As the
effectiveness of the queries grow, the probability
of false positives and false negatives will decrease
making these systems even more reliable.
There is no question biometrics will continue to
be used in law enforcement and criminal justice.
The ability to match an unknown suspect's identity
can help to find lawbreakers and protect society.
In addition, with concern to terrorism, states and
government agencies will use biometrics to identify and protect against a list of known offenders
(Zureik, 2004). For example, in the United States,
the Department of State began issuing passports that included biometric data in 2006.
Perhaps the biggest question for biometrics is
if they will be accepted on a large scale. While
users certainly understand and respect their use in
high security environments, it remains to be seen
if society as a whole will be willing to use them
for everyday tasks. The decrease in cost and the
increase in effectiveness make these systems potential solutions for a host of daily activities, even
as a possible replacement for everyday passwords.
However, before biometrics can reach that point,
citizens will need to be assured that there is little
risk to their privacy and civil liberties.


Conclusion
Today, personal and digital privacy have become
mainstream issues as a result of the growth of
cybercriminal activity and the severity of identity
theft. While individuals must take greater precaution with their own private data, biometrics
can assist in securing against fraudulent access
to information systems and the impersonation
of identity.
Biometric devices offer another layer of
security for physical and digital systems and assets. Their adoption, given proper planning, can
be a valuable method to authenticate users and
identify unknown persons. Modern technology
is improving the techniques biometric systems
use and lowering their cost of adoption.
The designers and creators of biometric
systems must be wary of the legal and ethical
questions that their devices are likely to create.
Indeed, biometric devices affect privacy on two
fronts. First, they can help protect user privacy
by legitimately authenticating identity. Their use
can protect against cybercrime and identity theft.
Second, due to their use and storage of personal
and unique characteristics, biometrics opens a host
of questions on their impact on civil liberties.
However, with adequate communication, users
are likely to appreciate systems that allow them
the ease of use and convenience that biometric
systems offer. Biometrics offer increased security
levels when used properly and in conjunction with
a well-thought-out security plan. For this reason, we
should expect their use to continue to grow in
the future.

References
Clark, R. (1994). Human identification in information systems: Management challenges and public policy issues. Information Technology and People, 7(4), 6-37.

Consumer fraud and identity theft complaint data,
January-December 2005. (2006). Retrieved June
1, 2007, from http://www.ftc.gov/bcp/edu/microsites/idtheft/downloads/clearinghouse_2005.pdf
Down, M., & Sands, R. (2004). Biometrics: An overview of the technology, challenges and control considerations. Information Systems Control Journal, 4, 53.
Harris, A. J., & Yen, D. C. (2002). Biometric
authentication: Assuring access to information.
Information Management and Computer Security,
10(1), 12-19.
Hemphill, T. (2001). Identity theft: A cost of
business? Business and Society Review, 106(1),
51-63.

Jain, A. K., Hong, L., & Pankanti, S. (2000). Biometric identification. Communications of the ACM, 43(2), 90-98.

Jain, A. K., Ross, A., & Prabhakar, S. (2003). Biometric recognition: Security and privacy concerns. Security & Privacy Magazine, IEEE, 1(2), 33-42.

Jain, A. K., Ross, A., & Prabhakar, S. (2004). An introduction to biometric recognition. IEEE Transactions on Circuits and Systems for Video Technology, 14(1), 4-20.

Johnson, M. L. (2004). Biometrics and the threat to civil liberties. Computer, 37(4), 90-92.

Liu, S., & Silverman, M. (2001). A practical guide to biometric security technology. IT Professional, IEEE, 3(1), 27-32.

Matyas, V., Jr., & Riha, Z. (2003). Toward reliable user authentication through biometrics. Security & Privacy Magazine, IEEE, 1(3), 45-49.

Rosenzweig, P., Kochems, A., & Schwartz, A. (2004). Biometric technologies: Security, legal, and policy implications. NASSP Legal Memorandum, 12, 1-10.

Schneier, B. (1999). Biometrics: Uses and abuses. Communications of the ACM, 42(8), 136.

Zureik, E. (2004). Governance, security and technology: The case of biometrics. Studies in Political Economy, 73, 113.

Additional Reading

Ashbourn, J. (2000). Biometrics: Advanced identity verification. Springer-Verlag.

Kittler, J., & Maltoni, D. (2007). Introduction to the special issue on biometrics: Progress and directions. IEEE Transactions on Pattern Analysis and Machine Intelligence, 29(4), 513-516.

Pons, A. P. (2006). Biometric marketing: Targeting the online consumer. Communications of the ACM, 49(8), 60-66.

Ratha, N. K., Connell, J. H., & Bolle, R. M. (2003). Biometrics break-ins and band-aids. Pattern Recognition Letters, 24(13), 2105-2113.

Chapter XVI

Government Stewardship of
Online Information:
FOIA Requirements and Other
Considerations
G. Scott Erickson
Ithaca College, USA

Abstract
This chapter focuses on the specific issue of the federal Freedom of Information Act and associated
state and local freedom of information laws. While full of good intentions regarding openness in government, the statutes have increasingly been applied to circumstances when individuals or organizations
seek government records for legal or business purposes. As such, confidential business information and
private personal information are both vulnerable when data are in government hands. Given the maze
of exemptions and agency interpretations regarding freedom of information requests, the circumstances
are both highly variable and unpredictable. Better understanding of the statutes and their interpretations
will help individuals and organizations make better decisions regarding data interactions with various
levels of government.

Introduction
In an age with ever increasing amounts of personal data held in commercial and government
databases, many individuals view the government

as the champion of their personal right to privacy.


While this is true to some extent, it is also true that
various levels of government are great generators,
collectors, and suppliers of personal data. Some
is published as a matter of course, some is sold.

But a substantial amount of information flows out


of the federal, state, and local governments every
day as a result of freedom of information requests.
Indeed, it is quite possible that information you
did not even realize the government holds is being turned over to a private concern right now,
whether the government agency involved really
wants to do so or not.
In this chapter, we will look at general trends
in data collection and processing and how they
relate to statute, execution, and court precedent
concerning the federal Freedom of Information
Act and related state laws. We will cover the laws
in some detail, including common exemptions
from requests and changes in the law over time.
Agency practices related to the act and key court
decisions will also be covered.
With that background, we will consider the
status and vulnerability of confidential business
information passed on to governments. Because
such information increasingly includes personal
details about individuals (chiefly customers),
government-held business information impacts
our privacy discussion. We will then move on to
consider the status and vulnerability of government databases concerning individual citizens.
The main point of the chapter is that matters
remain in flux in this area. Government openness
is fairly well-defined but privacy rights are not
entirely clear, especially related to freedom of
information requests. Regardless of which way
the balance tips (openness vs. privacy), more
certainty will help government, organizations,
and individuals better plan how and when to share
their own information resources.

Background
Freedom of Information
The Freedom of Information Act (FOIA) (Apfelroth, 2006; Uhl, 2003; Halstuk & Davis, 2002; Perritt, 1998; Perritt, 1995) was enacted in the U.S. in 1966 with the aim of opening up government to more public review, thus increasing accountability. The federal law applies to U.S. executive agencies (not judicial or legislative bodies), while all 50 states and the District of Columbia have also enacted freedom of information laws (FOIL), generally in line with the FOIA, covering their state and local government agencies. We will discuss these separately, where appropriate.
The FOIA requires governmental agencies to
proactively disclose or publish records as a matter
of course. It also requires them to answer requests
for disclosure of other, unpublished records. The
law applies only to existing records (agencies
do not need to create new ones to respond to a
request) and requests can be made by any individual or group, theoretically without explaining
why they want or need the records. There are nine
exemptions in the act that agencies can use, but
are not required to use, to deny an FOIA request.
These exemptions cover requests for information
relating to:

•	National security or foreign policy
•	Internal personnel policies and practices
•	Personal privacy when exempted by law
•	Trade secrets or other confidential business information
•	Deliberative privilege
•	Personnel or medical files
•	Law enforcement
•	Financial institution assessments
•	Geological and geophysical information and data

A tenth exemption was added concerning critical infrastructure in the wake of 9/11 (Uhl, 2003).
One of the key things the FOIA did was
change the presumption in terms of turning over
information. Where requesters previously had to
make the case in terms of needing the records, the
burden shifted to the agencies with the passage of
this act. The government entities were expected to turn over requested information unless they could justify not doing so because of one of the exemptions. So the presumption was that records would be published or turned over when requested. Only in rare circumstances would agencies resort to one of the exemptions. And, again, note that the exemptions permit the agencies to decline to reveal information; nothing requires them to decline. When exemptions might apply, it is up to the agencies whether to resort to them or not.
The act has been amended over the years, but the change most relevant to our discussion is found in the electronic FOIA amendments (EFOIA) of 1996 (Halstuk & Davis, 2002; Perritt, 1998; Leahy, 1998). The EFOIA made explicit what was already general practice, namely that the records specified in the original FOIA included electronic media. Further, any records should be made available to requesters in whatever format they desired (again, including electronic). Finally, the EFOIA reiterated that the balance between openness and privacy, where government information was concerned, should fall firmly on the side of openness. Indeed, an important court decision, Reporters Committee, which we will discuss later in this chapter, had raised some issues as to what agency records were subject to FOIA, suggesting that some might not be because of privacy concerns. As Leahy (1998), one of the EFOIA's Senate supporters, put it, FOIA is no longer limited to a "core purpose" of making agency records and information available to the public only in cases where such material would shed light on the activities and operations of government. In short, all information is subject to FOIA requests, not just certain information. Congress reiterated the presumption that records should be released, not held secret, with passage of the EFOIA.
As this discussion suggests, and as we all
know, the real key to the impact of legislation is
found not only in the statute itself but also in its
practice and its interpretation by the courts. The
intention of the statute was to open up government
to the scrutiny of concerned citizens and the press.



Interest proved to be much more extensive than anticipated. According to one source, 1,965,919
FOI requests were made to federal agencies in
fiscal 1999 (Uhl, 2003). This figure does not
include, of course, state and local requests. More
detailed data from the FDA in 2004 showed 18,540
requests costing the agency $12.8 million to process. Sixty-seven point six percent of the requests
were fully granted and 31.9% were not granted
because no records existed or the requests were
withdrawn. Forty-five requests were only partially
granted (some information withheld) and 39 were
denied, chiefly due to exemptions pertaining to
confidential commercial information or ongoing
investigations (Lurie & Zieve, 2006). As might be
deduced from the data, the FOIA has become a
source of commercial insights and litigation support. Indeed, the vast majority of requests under
the Act seek no information about the activities
of the government, but rather information about
business competitors, opposing parties in litigation, and the activities of other nongovernmental
entities (O'Reilly, 1982).
Some general aspects of agency behavior
and court decisions reflect this explosion in the
number of requests and the cost of fulfilling them,
as well as growing concerns about proprietary
commercial information and individual privacy.
We will discuss all in more detail, but it is important to recognize broad trends. Initially, as noted
previously, the EFOIA in 1996 not only firmed
up access rights for electronic records but also
reiterated the presumption on the side of openness.
This has generally been true of legislation, which
seems to regularly reassert Congress's intention to
push openness, even at the expense of sometimes
revealing proprietary or private information.
Practice at agencies varies widely (as it does
between states and localities), but the trends in
agency practice have been more in the direction
of privacy, not openness, and have been further
reinforced by court decisions. On one hand,
there has long been a distinction and protection
of confidential business information (Kilgore, 2004). As long as an agency recognizes that the information provided is confidential (e.g., documentation of a drug production process filed with
the FDA), this type of record clearly falls under
Exemption 4 of the FOIA. The onus is on firms
to identify and ask for such protection, but it is
often available and has been supported, as we will
discuss later. So even though FOIA requests are
used for competitive intelligence purposes, agencies are not always giving away the store.
Similarly, privacy concerns have resulted in
some limitations in practice. Once again, there are
specific exemptions concerning privacy though
some are, on their face, more relevant to government employees than to the general public. In spite
of that aspect, there is a general attitude toward
balancing openness and privacy, even though the
law's presumption tends to favor openness. But a number of agencies have developed a standard policy toward FOIA requests, delivering records of a general or statistical nature while seeking to exempt requests focusing on individually identifiable information. This approach was legitimized in the case referred to as Reporters Committee in the literature (Halstuk & Davis, 2002; Cate, Fields, & McBain, 1994). In Reporters, the U.S. Supreme Court essentially ruled that personal privacy should not be violated by release under FOIA unless the information dealt with the central purpose of the agency's operations. The decision goes back to the intentions of the original legislation, monitoring what the agency is doing, while limiting access to data it might have accumulated that is tangential to its actual operations. A wave of lower court decisions has followed and supported Reporters Committee. So although openness advocates frown upon it, the case is now well-established as a precedent.
More recent activity includes a trend toward
more national security exemption denials of FOIA
requests in the wake of 9/11 (especially those
made to the Department of Homeland Security)
and a potentially far-reaching U.S. Supreme
Court decision in National Archives and Records

Administration v. Favish in 2004. In a number of ways, it is probably too early to discern the full
impact of either event but they are of import and
will enter into our later discussion.
In the former case, the increasing scrutiny
of FOIA requests is a matter of Bush administration policy, changes in agency behavior, and
aggressive support from the Department of Justice when challenged (Uhl, 2003). As memory
of 9/11 fades and/or as administrations change
(Uhl specifically contrasts the very different
tendencies toward requests between the Clinton
and Bush administrations), agencies' attitudes
and behaviors may very well change, too. And,
of course, the courts will likely have something
to say as well when challenges to denials move
through the system.
In the latter case, Favish was an attorney who sought death scene and autopsy photos under FOIA after Vince Foster's 1993 suicide. The top court's decision in favor of denial firmly established privacy rights for survivors; the family's privacy interests outweighed the public interest in disclosure (Halstuk, 2005). But probably even more importantly for our purposes, the court created a "sufficient reason" test, requiring the FOIA requester to demonstrate a public interest that would be aided by disclosure. Without a sufficient reason, the requester cannot overcome a "presumption of legitimacy" that agencies are acting appropriately (Halstuk, 2005). This has the potential to be an extremely important change in philosophy if the presumption effectively passes from the agency to the requester. For 40 years, the burden has been on the agency to prove why an FOIA request should not be honored. Conceivably, this case transfers that burden to the requester, who must establish why the request should be honored. But, decided only in 2004, it is far too early to try to assess the full impact of the case and whether it will hold up as a new precedent.
The federal FOIA and state/local FOILs
have always posed an issue of balancing openness in government with privacy rights, though those rights have often had to do with the privacy of government employees, criminals, or corporations rather than the general public. Though a
generalization, there has been a tension between
legislatures pushing openness and agencies and
courts moving more toward privacy. Freedom of
information is a concept and policy still in flux
but it has the potential to have substantial implications for businesses and individuals as we move
ever further into the digital age.

Privacy
Although a lot of discussion about privacy rights
takes place in the U.S., the actual legal standing of the concept is not as straightforward as it
might seem. Indeed, in the previous section, we
used the term rather frequently, but extensions
to individuals who might have information concerning them within government databases can
be problematic.
The right to privacy generally derives from the Fourth Amendment: "[t]he right of the people to be secure in their persons, houses, papers, and effects against unreasonable searches and seizures, shall not be violated." As should be clear from the context, this text tends to refer to government rather than private individuals or organizations (Brenner & Clark, 2006). So the U.S. Constitution deals with privacy from the government, but not necessarily from Google. The right to privacy as we generally perceive it comes more from court rulings over time (Dalal, 2006), particularly Olmstead, wherein Brandeis coined the term "the right to be let alone." A later case, Katz, formalized the concept, establishing the criteria of whether an individual expected privacy and whether the expectation was reasonable; if so, their privacy should not be invaded (Brenner, 2005). This decision also extended Olmstead in that it made clear privacy applied to people, not just places (the home).
How does privacy apply to records, particularly electronic ones? There are existing laws, but they tend to be a hodge-podge, covering only very specific circumstances. At the federal
level, statutes protect privacy in areas including
electronic communications, health care records,
financial data, video rental data, Internet data
regarding children, and cable subscriber data
(Solove, 2005; Levary, Thompson, Kot, Brothers,
2005; Eden, 2005). There has been some action
on data gathered through radio frequency identification (RFID) technology, in particular, that
we will discuss in more detail later. Even there,
however, government action has been sparse with
legislation passed in only a single state (Wisconsin) while potential regulators at the federal level,
such as the Federal Trade Commission, have
recommended only voluntary standards related
to privacy (Hildner, 2006).
In summary, although the right to privacy is bandied about as if the term were a carefully defined legal concept enshrined in the Constitution, its actual status and application are fairly loose. It has been defined more in practice by organizations and the government. As such, any protections are less entrenched than they might seem to be.

Main Thrust

Contemporary Databases and the Internet
In an age of an ever-increasing number of electronic and online databases that contain an ever-increasing depth of information, the issue of privacy has become of considerable concern to both
individuals and organizations. The past 10 years
have seen an explosion of spending in information
systems that tie parts of firms, collaborator firms,
and customers together. These extended networks
of data exchange are now a critical part of doing business, transferring information between
individuals and organizations on an ongoing and
increasing level on a daily basis.


Enterprise resource planning, supply chain management, and customer relationship management systems have all multiplied over the past decade (Rothberg & Erickson, 2005). All are typically Web-based today, of course. But
the basic point is that a great deal of information
is constantly flowing around the Internet and
firms are using the information to build massive, valuable, proprietary databases concerning
operations, supply chain partners, and customers
(both consumers and organizations).
The most recent extensions of these trends have
been data collection methods on the Internet and
through radio frequency identification technology, both serving to firmly establish the point to
be made concerning the nature and ubiquity of
contemporary corporate databases. The Internet
has long been seen as a superior way to gather
information on how consumers shop: what attracts
them to a site, what options they consider, what
communications and promotions move them (or
not), and other such data. With cookies and other
data-gathering mechanisms, such information
could be collected on an individual basis for a
vast number of consumers, with the vast majority
unaware of the process.
Similarly, RFID is not only a way to more efficiently manage supply chains and operations, but also a mechanism to better record what goes on
with consumers in retail environments and in service operations. RFID tags act in some ways like
bar codes, but can be linked to much more information and can report data through long-distance
readers rather than the close-at-hand scanners one
sees at the supermarket. When joined with loyalty
card or credit card data, RFID systems can gather
extensive individual information on purchases,
usage, response to communications, and so forth.
One of the most high-profile applications of RFID,
for example, is in casinos, where gaming choices,
betting patterns, cash levels, responses to price
levels, promotional choices (free room, free show,
betting allowances, etc.) and other such individual
data can be gathered and stored, later used to

craft appeals to specific customers to get them back into the casino more quickly.
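To make the joining mechanism concrete, the following is a minimal, purely hypothetical sketch (not drawn from any actual casino or retail system) of how raw RFID read events, which by themselves identify only a tag, become an individual behavioral profile once matched against a loyalty-card table. All data, table names, and field names here are invented for illustration.

# Hypothetical illustration: linking anonymous RFID read events to named
# customers through a loyalty-card table. All records are invented.

rfid_reads = [  # (tag_id, reader_location, timestamp)
    ("TAG-0017", "blackjack-pit-3", "2008-05-01T21:05"),
    ("TAG-0017", "cashier-2",       "2008-05-01T23:40"),
    ("TAG-0042", "slots-floor-1",   "2008-05-01T20:15"),
]

loyalty_cards = {  # tag_id -> customer record captured at card sign-up
    "TAG-0017": {"name": "J. Smith", "card_no": "555-0101"},
    "TAG-0042": {"name": "A. Jones", "card_no": "555-0202"},
}

# Build a per-customer behavioral profile by joining on the tag identifier.
profiles = {}
for tag_id, location, when in rfid_reads:
    customer = loyalty_cards.get(tag_id)
    if customer is None:
        continue  # unlinked tag: the read stays anonymous
    profiles.setdefault(customer["name"], []).append((when, location))

for name, events in profiles.items():
    print(name, "->", sorted(events))

The same pattern scales to millions of reads; the point is simply that the identifying link resides in the join with the loyalty or payment record, not in the tags themselves.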
The main point is that extensive commercial
databases are being constructed by organizations. And with good reason, as better knowledge
concerning individuals allows better-crafted offerings that better address needs and wants. But
modern technology, as in the case of the Internet
or RFID, allows this data to be collected without
individuals necessarily knowing it. This raises
immediate issues for groups concerned with
privacy. With RFID, for example, one interest
group objects to the technology and its use with
consumer products because:

•	There is information disclosure, without consumer control;
•	There is tracking, point-to-point, point-to-individual, or individual-to-individual;
•	Collection is essentially invisible (readers are unobtrusive);
•	Data are joined, without consumer awareness; and
•	Tags and readers are ubiquitous and distributed (Mulligan, 2004).

All of this becomes important in that governments, at all levels, are also capable of building
these types of databases (indeed, one of the initial adopters of RFID technology was the U.S.
Department of Defense, mandating tagging for
all major suppliers). Further, corporations submit
vast amounts of information from their proprietary
databases in regulatory filings, including all sorts
of operational and marketing data down to the
individual consumer. The federal government is
the largest single producer and collector of information in the United States (Leahy, 1998). And,
finally, governments purchase marketing and other
databases, combining them with other, existing
databases in some cases. We will discuss some
specific examples in the following sections.
The fact that all of this activity is occurring in
an age of ever-increasing transfer and publication of information over the Internet just exacerbates the problem. Much of the discussion has implicitly
referred to how online activity affects the FOI
issue. But it does not hurt to explicitly note the
challenges in one place.
As just noted, the Internet allows easy and
unobtrusive data collection on individuals. It
also allows easy collection and categorization of
such data, as it is collected on a digital system
and fed instantly into digital storage. The Internet
provides a ready mechanism to instantly transfer
copious amounts of data from one entity to another,
whether that is business-to-business, business-to-government, or government-to-business. Consequently, joining databases is also much simpler
given internet technology and connections. And
data mining capabilities also make a difference, as
organizations can easily review and analyze data
from a variety of sources. Essentially, the Internet
has sharply increased the ability of firms to access and use databases, creating greater interest
in what information various levels of government
might hold. Hence, there is the potential for ever
more FOIA requests at all levels.
On a different level, online connections have
also made the process of requesting records and
delivering records simpler. Firing off requests
for records at multiple agencies at all levels of
government is much easier given internet technology. At the other end, those fulfilling such
requests (and considering whether to fight them
or not by claiming an exemption) should find it
much simpler to just honor the request by pushing a few buttons rather than try to establish and
perhaps get involved in a dispute over whether
exemptions apply. Indeed, the logical solution for
many agencies receiving multiple requests for the
same data would be to simply publish it to the
net where anyone can access it at will. The ease
of use aspect is unlikely to enhance privacy as it
makes increased openness so effortless.
But the main issue is that extensive organizational and consumer data is held in federal, state,
and local databases. If one is concerned with the security of proprietary or personal information, how secure is it? Especially in light of what we
have been discussing concerning FOIA activity?
And especially in light of the explosive growth
of the Internet over the past decade?

FOIA and Commercial Data


Given the regulatory function performed by
government at all levels, it is no surprise that a
wealth of proprietary commercial data is gathered
and held by agencies. Further, other government
activities generating information may also create
interesting databases, with records and submissions from court proceedings, for example. And
even though courts are theoretically not covered
by FOIA, if the Justice Department or another
agency is involved in a court case, the records
can be subject to the law.
As noted earlier in this chapter, this vast trove
of data has generated quite a bit of interest in
terms of FOIA requests. And the attention has not come from the press or other open government advocates; it has tended to come from business (chiefly competitors), litigants, and others seeking advantage from the records. O'Reilly (1982) notes examples such as Suzuki's established program to harvest all Toyota filings with the federal government since 1981 (for data not available in Japan), the openness of records on anyone seeking to contract with the government, foreign government-subsidized searches by bearing and aircraft manufacturers, and a food processor's innovative filtration system retrieved from a regional EPA office. Indeed, EPA filings are a well-known source of detailed information on proprietary processes, as analysts can discern the types of machines employed, anticipated production volumes, and so forth, from emission and other projections (Rice, 2000).
Also noted earlier was Exemption 4 in the
FOIA for trade secrets or other confidential business information (CBI). What constitutes CBI is
fairly well established by precedent, with the case


National Parks v. Morton instituting the language that commercial or financial matter is confidential for purposes of the exemption if disclosure of the information is likely to cause substantial harm to the competitive position of the person from whom the information was obtained (Kilgore, 2004).
But as may be gathered from the high level of
commercially motivated FOIA requests, the actual
workings of the exemption are not perceived as a
strong protection mechanism.
Indeed, several issues exist. Establishing
whether a disclosure would cause substantial
harm to the competitive position is tricky and
is generally left up to agency personnel. While
a number of firms designate records as proprietary, confidential, sensitive, or otherwise when
submitting, it is really up to agency personnel
(and, perhaps, outside evaluators brought in to
assess) whether a record is truly CBI. Further,
if a firm believes that it would be harmed by
disclosure, it must establish that the commercial
harm is immediate, essentially that revealing the
records would result in instant marketplace results
(O'Reilly, 1982). That, of course, is extremely
difficult, especially with innovative practices,
products, customer insights, and such.
Further, the exemptions are not mandatory.
Agencies are allowed to deny FOIA requests but
are not required to do so. The decision really is up
to agency personnel. Not surprisingly, procedures
vary widely by agency (Kilgore, 2004). The FAA
and several other agencies have separate public
and non-public files for CBI records. The NHTSA
will inform a submitter whether CBI is honored
or not, but if not allowed, the agency will then
publish the record without allowing withdrawal.
The Department of Labor requires each sheet be
labeled confidential and then has a Director of
the Office of Trade Adjustment Assistance determine whether the CBI exemption applies or not. In
short, some agencies have established predictable
procedures while others appear to be totally ad
hoc (Kilgore, 2004; O'Reilly, 1982).

A further complication is the changing nature of what might be considered CBI. Standard
intellectual property such as patents, copyrights,
and such have always had some sense of protection; all are revealed anyway, though additional
information covering how patented technology
works best, and so forth, might not be. Intellectual
property clearly has some commercial value but
also has specific and generally effective protection, even when revealed. Trade secrets and more
loosely defined or protected information are a stickier issue. Trade secrets, by definition, are
secret and must be kept so in order to have value.
One interesting development in recent years has
been the firming up of trade secret law in the U.S.,
as the Economic Espionage Act more explicitly
established what a trade secret is (basically, any
business information of value) and what must be
done to keep it so (strong attempts to maintain its
secrecy) (Carr, Erickson, & Rothberg, 2004).
The EEA and its interpretation create a couple
of interesting issues regarding commercial information and the FOIA. Initially, what might be
considered CBI or a trade secret is wider than
what had been standard practice in the past. In
the enforcement of the EEA, quite a number of
prosecutions have centered on marketing data,
including customer lists and consumer information. We discussed casinos and their mountains
of customer data earlier in the chapter. Harrah's
customer database, the state of the art in this
industry, can undoubtedly now be classified as a
trade secret. It might not have been a few years
ago. Secondly, the onus is really on the holder of
the trade secret to keep it secret. Sloppy protection
mechanisms, loose internal and external security,
and other such actions can invalidate trade secret
status. The holder must have proper procedures
to keep the information hidden. After objecting
to some new product information being posted by
bloggers, Apple, for example, was basically told "tough luck" by the courts, at least in part because it showed loose controls when the information got out in the first place (O'Grady, 2006).




So it is quite possible that records turned over to a government agency end up being released,
invalidating trade secret protection because the
information is no longer secret and/or revealing
it to the government may be seen as an indicator
of lack of controls. Further, there is no guarantee
right now that any particular agency will recognize
information as CBI or a trade secret, that it will
handle it carefully, or that it will choose to utilize
an exemption to the benefit of the submitter.
And, finally, in terms of consumer privacy,
what of individuals who might have personal
information stored within such records and/or
others the government has accumulated? They
obviously bear some risk here, too, and so we
turn to that topic.

FOIA and Individuals


Federal, state, and local governments hold a great
deal of data concerning individuals, including
some coming from commercial enterprises.
While issues of technological or operational trade
secrets are less prominent in this arena, deep knowledge
of individual consumers is a growing source of
competitive advantage and is unique, valuable,
proprietary information for organizations. And
this can pop up in government databases as financial information, medical records, demographic
information, and the like are gathered and held
at various levels.
There is great interest in accessing this sort
of information. Certain government databases
can be valuable in and of themselves (e.g., census
data) or can become valuable when combined
with other databases (e.g., housing purchases,
financial data, and tax data). But balancing that
interest are obvious concerns about personal
privacy (Gomez-Velez, 2005). While the government has a right and, perhaps a duty, to provide
and/or sell some data (Leahy, 1998; Perritt, 1995),
these personal privacy issues are also important.
As discussed earlier in this chapter, privacy
rights may be somewhat nebulously defined, but


there are a substantial number of individuals and organizations who value them and are nervous about their own rights in the age of the Internet (Gomez-Velez, 2005).
So there is information in government databases, some from commercial sources, concerning
individuals. And there are privacy issues. How
do these records pertain to FOIA? Initially, note
that any privacy promises made by corporations
before passing along data are worth very little.
Once the Internet boom collapsed in the early
part of this decade, one of the few remaining assets of bankrupt technology companies was their
customer database. And even though promises
had been made to individuals not to release their
personal information to others, the rights of debtholders seeking to sell off such assets appeared
to take precedence in at least one case (Kelly &
Erickson, 2004). Although this toysmart.com
case was never fully adjudicated, the direction
was clear: personal privacy promises made by
organizations had very little clout in court when
balanced against other interests. Thus, if such
data ends up held by the government, the privacy
promises of the submitter likely have very little
bearing on whether it will be shared or not.
In terms of FOIA itself, there are again specific exemptions relating to personal records and
privacy. Exemption 3 concerns personal privacy
when exempted by law while Exemption 6 has to
do with personnel and medical records. Both have
been used repeatedly over the years. In fact, agencies developed something of a standard approach,
reviewing the balance between openness and
personal privacy in making FOIA decisions. Essentially, if the records held details of identifiable
individuals, agencies often declined to honor FOIA
requests and were supported by the courts (Cate
et al., 1994). Indeed, in specific cases involving
law enforcement (release full investigative records
or withhold those with individually identifiable
information), medicine (performance records of
individual doctors), and medical records (several
cases), courts affirmed that the individually identifiable perspective trumped the public's need to know (Prime, 1996; Sharrott, 1992; Harvard,
2007). Most of this is credited to the Reporters
Committee decision discussed earlier in this chapter. The court decision, through the central purpose concept, lessened the weight of the public's right to know. If the records are not central to the agency's purpose and operation, the need for
openness diminishes. Reporters also strengthened
privacy considerations, establishing individually
identifiable data as having a legitimate privacy
interest and extending the personnel and medical
record exemption to all personal information (Cate
et al., 1994). Openness in government advocates
objected strenuously, and the EFOIA amendments
in 1996 were seen as something of an answer to
Reporters, reestablishing the primacy of openness
in the balance (Halstuk & Davis, 2002; Perritt,
1998), but the case still acts as the basis for a lot
of agency decisions to decline FOIA requests on
the basis of personal privacy.
Also a help, at least to privacy advocates, was
the Privacy Act of 1974. The law gave individuals
more power over their personal records held by
the government and ostensibly required agencies
to deny FOIA requests for individually identifiable information. In large part, the law helped.
But there is a bit of circularity to it (do not release
unless required by law) so that an official giving
more credence to FOIA (the very law requiring
release) might still choose to honor the request.
In other words, do not release unless required by
law, but since FOIA is a law, privacy might not be
respected. Again, the act supported privacy but
has not proved to be the final answer (Susman,
1988; Johnston, 1983).
So is there, then, a privacy concern about FOIA
and individual citizens? There are still issues to
be worked out, on a number of fronts. Initially,
as with proprietary commercial data, FOIA exemptions related to personal privacy are allowed,
not required. So whether to be concerned about
individually identifiable records is totally up to
an individual agency. Some have processes for

review and standards for determining this balance between openness and privacy. Others still
remain less concerned and review requests based
on such concerns in an ad hoc manner, if at all. As
long as different agencies have different attitudes,
approaches, and eventual decisions, individuals
will need to remain concerned about their personal
privacy. And as we discussed relating to Internet-gathered data and RFID data, information can be gathered unobtrusively, without individuals even knowing they have become part of a file, perhaps
held by a government agency that routinely honors
FOIA requests.
Further, the joining of data is a major part
of the problem, as both firms and government
entities build ever bigger databases by combining holdings. What could very well happen is that government data without individually identifiable features might be released under FOIA and
then combined with a database with extensive
personal details. With modern technology, it
would not be that great a chore to match up some
details in each database and add the government
data on a person-by-person basis. So the original government database would have no call to
be withheld for reasons of personal privacy, but
would be personally identifiable when combined
with other data.
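As a purely illustrative sketch of the matching step just described (the datasets, field names, and matching rule below are assumptions, not drawn from any real release), the following shows how a record-level file released without names could be re-identified by joining it on quasi-identifiers, here ZIP code and birth year, against a commercially available marketing list.

# Hypothetical illustration of re-identification by joining two databases on
# quasi-identifiers (ZIP code and birth year). All records are invented.

released_records = [  # "de-identified" government data: no names included
    {"zip": "49008", "birth_year": 1971, "license_type": "concealed carry"},
    {"zip": "44115", "birth_year": 1985, "license_type": "liquor retail"},
]

marketing_list = [  # purchased consumer database with direct identifiers
    {"name": "J. Smith", "zip": "49008", "birth_year": 1971},
    {"name": "A. Jones", "zip": "44115", "birth_year": 1985},
    {"name": "B. Brown", "zip": "44115", "birth_year": 1990},
]

# Join the two sources wherever the quasi-identifiers match.
for record in released_records:
    matches = [person for person in marketing_list
               if person["zip"] == record["zip"]
               and person["birth_year"] == record["birth_year"]]
    if len(matches) == 1:  # a unique match re-identifies the record
        print(matches[0]["name"], "->", record["license_type"])

In practice, additional quasi-identifiers (gender, full date of birth, and so on) make unique matches far more common, which is exactly the concern raised above.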
Finally, and somewhat adding to this general
issue, FOIA is supported by 51 FOIL statutes in
the states and District of Columbia. While very
similar in many respects, the state laws do not
precisely mirror the federal statute. Indeed, they
tend to vary considerably by state (Tesler, 2000;
Westin, 1996; Vaughn, 1984) and, of course, in
their actual administration and specific judicial
guidance. State and local agencies are likely to
have far less guidance in terms of how to process
and evaluate FOIL requests, and, of course, are
far less likely to have established procedures,
openness or privacy officials, or any other such
systems. Bloom (2006) discusses at length the
case of Greenwich, CT, recently decided by the Connecticut Supreme Court. The court required the city to hand over voter registration records, a geographic information system (GIS) database,
and a recreation management database to a computer consultant, the GIS records being subject
to a denied FOIA request. As Bloom puts it, the
court almost blindly supports the most far-reaching interpretation of the FOIA without giving
adequate weight to the countervailing legal and
policy considerations. As long as decisions are
made on an agency by agency, state by state,
municipality by municipality, and/or court by
court basis, the exact balance between openness
and privacy will tend to vary widely. Individual
citizens will never quite know how their personal
information stored in government databases will
be treated.
To summarize, the U.S. still has agencies at
all levels holding and exercising some discretion
over whether to use FOIA exemptions (or not)
and widely varying procedures for making such
determinations. Advances in database technology
mean that records not individually identifiable
in one database may be easily identified person
by person in another. And state and local laws,
agency procedures, and court decisions also vary
widely in terms of the openness/privacy tradeoff.
The vast amounts of personal information held
by various levels of government on U.S. citizens
are not free from privacy concerns. In some
ways, matters have gotten better from a privacy
standpoint, but in a number of ways they have gotten worse. Firms and individuals cannot act with
surety regarding valuable proprietary or personal
information in government hands, so there is still
considerable room for improvement.

Recommendations

The situation calls for more clarity at all levels. For business, the situation is not so much whether proprietary information might be released or not. If it is to be released, for sure, then individual firms can at least establish their own policies in terms of what records will be transferred to the government. If it is not to be released, they will likely submit much more information and do so more willingly. It is the certainty that is the issue.
For individuals, there also needs to be more
clarity. What information is held, who holds it, and
is it subject to release to others? Is it personally
identifiable or not, or could it be made identifiable in combination with other records? As noted
throughout this chapter, the presumption of the
federal legislature has always been that government openness was the priority, with privacy a
secondary concern. With modern databases, the
wealth of information held by either governments
or private organizations, and the possibilities of
joining records, privacy is a much greater concern
than was the case forty years ago when the FOIA
was enacted. This has been reflected in some
agency behaviors and some court decisions, but
a more sweeping and clear privacy statute would
help everyone know what is fair game for FOIA
requests and what should be kept private.
The European Union, for example, has a situation opposite that in the U.S. The U.S. case, as
we have discussed, includes fairly well-defined
open government statutes and court decisions but
fuzzy definitions of privacy and privacy rights. In
the EU, privacy rights are well-defined while open
government varies by country and a full-fledged
FOI statute only became a concern with the advent
of the Maastricht Treaty in the 1990s (Perritt &
Lhulier, 1997). Consequently, the EU is seen as
having stronger privacy protection for its citizens
and considerably less uncertainty for individuals,
firms, and government. The EU Data Protection
Directive includes, among other things:

•	Requirements of data quality (accuracy and timeliness)
•	Prohibitions on processing sensitive data
•	Required notification of the data subject
•	Limitations on disclosures to others
•	Rights of the data subject to object to certain processing
•	Requirements of levels of security (Maxeiner, 1995)

This is not to say there is anything magic about the EU approach, but it does provide individuals
with more knowledge of the data held concerning
them and what is done with it. And, because the
directive applies to both government and private
entities, the problems of joining are lessened as
well. Regardless of the specific approach, it is the
certainty of the definitions and policies that would
help everyone more than the current uncertainty
at all levels.

Future Trends & Conclusion


This chapter has focused on several trends, so
they should already be clear. But let us reiterate.
A tension between openness in government and
personal privacy exists in the U.S. because of the
federal Freedom of Information Act and universal
state Freedom of Information Laws. The trend
in the federal legislature has been to continually
strengthen the FOIA and openness by reaffirming
a presumption that government records should be
released unless there is a compelling reason not
to. Alternatively, the trend in agency practice and
the courts has been toward more privacy, allowing
use of certain exemptions in the FOIA to deny
records to individuals or organizations seeking
them. This balance has been clarified somewhat by
legislation on electronic records, agency practice,
and a number of court cases suggesting agencies
can limit releases to central purpose activities
and records not including individually identifiable information.
Prominent trends have surfaced over the last
10 years, in terms of databases and privacy.
In the former case, the range, depth, processing
and joining capabilities found in contemporary
electronic databases can be mind-boggling. This
fact has contributed to a growing interest in personal privacy, as individuals and organizations

begin to realize exactly how much information


is out there concerning them and how it might
be used. Because of the interest in privacy and a
corresponding weakness in defining the field and
individual rights, the topic is an important one
vis-à-vis FOIA. Governments hold tremendous
amounts of their own records, as well as those obtained from corporations, and those that could be
created from further joining or analysis. As such,
the status of proprietary commercial information
and other personal information held by various
levels of government is a concern. As databases
and their capabilities continue to grow, these
concerns will only grow without more clarification of privacy rights.
One group of commentators favors strong
FOIA statutes, practice, and court support. Another favors weaker FOIA but stronger privacy
considerations. Which way the balance goes is less
of a concern than establishing certainty so that
organizations and individuals know what steps
to take regarding their proprietary or personal
information. Knowing how the data will be used
and protected can influence organizations deciding whether to do business with various levels of
government. Similarly, such knowledge can help
individuals decide whether to pursue some sorts
of licenses, invest in property, or even register to
vote. More certainty will help individual decision
making and, by extension, the government's ability to serve the needs of its constituents.

Future Research
As reiterated often in this chapter, this is a constantly changing field with new decisions made
on FOIA and FOIL applications on a daily basis
at all levels of government. There are new statutes and new court decisions constantly flowing
from various legislatures and courts. With each
major change, opportunities for new research
are created.




In terms of immediately relevant and predictable research directions, however, we have the
obvious differences in FOIA/FOIL practice that
we have discussed. Initially, at the federal level,
different agencies can have dramatically different
procedures and responses to FOIA requests. Some
have officers or groups in charge of processing and
determining whether exemptions apply. Millions
of dollars are spent at the federal level answering hundreds of thousands of annual requests.
Analysis of differences in agency approach and
process could be fruitful research directions and
help our understanding of the benefits and burdens
of the FOIA.
Similarly, at the state level, laws, agency behavior, and processes differ markedly. There is some
research on differences in statute and court decisions, but the field is again ripe for more detailed
examinations of specific agencies, approaches,
staffing, and so forth. In line with that theme, as
freedom of information concerns become more
formalized in Europe and elsewhere, similar
research directions could prove fruitful.

References
Apfelroth, J. (2006). The open government act: A
proposed bill to ensure the efficient implementation of the freedom of information act. Administrative Law Review, 58(1), 219.
Bloom, I. (2006). Freedom of information law in
the digital age: The death knell of informational
privacy. Richmond Journal of Law & Technology,
12(Spring), 9.
Brenner, S. W. (2005). The search and seizure of
computers and electronic evidence: The fourth
amendment in an era of ubiquitous technology.
Mississippi Law Journal, 75(Fall), 1.
Brenner, S. W., & Clark, L. L. (2006). Fourth
amendment protection for shared privacy rights
in stored transactional data. Journal of Law and
Policy, 14, 211.



Carr, C. A., Erickson, G. S., & Rothberg, H. N.


(2004). Intellectual capital, competitive intelligence and the economic espionage act. International Journal of Learning and Intellectual
Capital, 1(4), 460-482.
Cate, F. H., Fields, D. A., & McBain, J. K. (1994).
The right to privacy and the publics right to
know: The central purpose of the freedom of
information act. Administrative Law Review,
46(Winter), 41-74.
Dalal, R. S. (2006). Chipping away at the constitution: The increasing use of RFID chips could lead
to an erosion of privacy rights. Boston University
Law Review, 86(April), 485.
Eden, J. M. (2005). When big brother privatizes:
Commercial surveillance, the privacy act of 1974
and the future of RFID. Duke Law & Technology
Review, 20.
Gomez-Velez, N. (2005). Internet access to court
records: Balancing public access and privacy.
Loyola Law Review, 51, 365-438.
Halstuk, M. E. (2005). When is an invasion of
privacy unwarranted under FOIA? An analysis
of the supreme court's sufficient reason and
presumption of legitimacy standards. University of Florida Journal of Law & Public Policy,
16(3), 361-400.
Halstuk, M. E., & Davis, C. N. (2002). The public
interest be damned: Lower court treatment of
the reporters committee central purpose reformulation. Administrative Law Review, 54(3),
983-1024.
Harvard Law Review. (2007). Developments in
the law of media. 120(4), 990.
Hildner, L. (2006). Defusing the threat of RFID:
Protecting consumer privacy through technology-specific legislation at the state level. Harvard Civil Rights-Civil Liberties Law Review,
41(Winter), 133.


Johnston, C. A. (1983). Greentree v. United States Customs Service: A misinterpretation of
the relationship between FOIA exemption 3 and
the privacy act. Boston University Law Review,
63, 509-531.
Kelly, E. P., & Erickson, G. S. (2004). Legal and
privacy issues surrounding customer databases
and e-merchant bankruptcies: Reflections on
toysmart.com. Industrial Management & Data
Systems, 104(3), 209-217.
Kilgore, H. E. (2004). Signed, sealed, protected:
Solutions to agency handling of confidential
business information in informal rulemaking.
Administrative Law Review, 56(2), 519-534.
Leahy, P. (1998). The electronic FOIA amendments of 1996: Reformatting the FOIA for online access. Administrative Law Review, 50(2),
339-344.
Levary, R., Thompson, D., Kot, K., & Brothers,
J. (2005). Radio frequency identification: Legal
aspects. Richmond Journal of Law & Technology, 12(Fall), 6.
Lurie, P., & Zieve, A. (2006). Sometimes the
silence can be like thunder: Access to pharmaceutical data at the FDA. Law and Contemporary
Problems, 69(Summer), 85-97.
Maxeiner, J. R. (1995). Freedom of information
and the EU data protection directive. Federal
Communications Law Journal, 48(December),
93-104.
Mulligan, D. (2004). Privacy and information
goods. FTC RFID Workshop, June 21. www.ftc.gov/bcp/workshops/rfid
O'Grady et al. v. Superior Court (2006), 139 Cal. App. 4th 1423.
O'Reilly, J. T. (1982). Regaining a confidence:
Protection of business confidential data through
reform of the freedom of information act. Administrative Law Review, 34, 263-313.

Perritt, Jr., H. H. (1995). Sources of rights to access public information. William & Mary Bill of
Rights Journal, 4(Summer), 179-221.
Perritt, Jr., H. H. (1998). Electronic freedom of
information. Administrative Law Review, 50(2),
391-419.
Perritt, Jr., H. H., & Lhulier, C. J. (1997). Information access rights based on international
human rights law. Buffalo Law Review, 45(Fall),
899-929.
Prime, J. S. (1996). A double-barrelled assault:
How technology and judicial interpretations
threaten public access to law enforcement records. Federal Communications Law Journal,
48(March), 341-369.
Rice, S. (2000). Public environmental records: A
treasure chest of competitive information. Competitive Intelligence Magazine, 3(3), 13-19.
Rothberg, H. N., & Erickson, G. S. (2005). From
knowledge to intelligence: Creating competitive
advantage in the next economy. Woburn, MA:
Elsevier Butterworth-Heinemann.
Sawyer, S., & Tapia, A. (2005). The sociotechnical nature of mobile computing work: Evidence
from a study of policing in the United States.
International Journal of Technology and Human
Interaction, 1(3), 1-14.
Sharrott, D. (1992). Provider-specific quality-of-care data: A proposal for limited mandatory
disclosure. Brooklyn Law Review, 58(Spring),
85-153.
Solove, D. J. (2005). The coexistence of privacy
and security: Fourth amendment codification
and Professor Kerr's misguided call for judicial
deference. Fordham Law Review, 74(November),
747.
Susman, T. M. (1988). The privacy act and the
freedom of information act: Conflict and resolution. John Marshall Law Review, 21, 703-733.




Tesler, W. (2000). Gould debunked: The prohibition against using New York's freedom of information law as a criminal discovery tool. New York
Law School Law Review, 44, 71-129.
Uhl, K. E. (2003). The freedom of information act
post-9/11: Balancing the publics right to know,
critical infrastructure protection, and homeland
security. American University Law Review,
53(October), 261-311.
Vaughn, R. G. (1984). Administrative alternatives
and the federal freedom of information act. Ohio
State Law Journal, 45(Winter), 185-214.
Westin, M. (1996). The Minnesota government
data practices act: A practitioner's guide and observations on access to government information.
William Mitchell Law Review, 22, 839-902.

Additional Reading
Andrussier, S. E. (1991). The freedom of information act in 1990: More freedom for the government,
less freedom for the people. Duke Law Journal,
41(June), 753.
Beall, C. P. (1996). The exaltation of privacy
doctrines over public information law. Duke Law
Journal, 45, 1249.
Bunker, M. D., Splichal, S. L., Chamberlin, B. F.,
& Perry, L. M. (1993). Access to government-held
information in the computer age: Applying legal
doctrine to emerging technology. Florida State
University Law Review, 20(Winter), 543.
Davis, C. N., & Splichal, S.L. (Eds.). (2000). Access
denied: Freedom of information in the information
age. Ames, IA: Iowa State University Press.
Grunewald, M. H. (1988). Freedom of information act dispute resolution. Administrative Law
Review, 46, 1.



Halstuk, M. E. (2002). The threat to freedom


of information. Columbia Journalism Review,
40(5), 8.
Heinrich, C. (2005). RFID and beyond: Growing
your business through real world awareness.
Indianapolis, IN: Wiley.
Hildebrand, M. J., & Klosek, J. (2005). Recent
security breaches highlight the important role of
data security in privacy compliance programs.
Intellectual Property & Technology Law Journal, 17, 20.
Hostetter, D. Z. (2005). When small technology
is a big deal: Legal issues arising from business
use of RFID. Shidler Journal of Law, Commerce
& Technology, 2, 102.
Kelly, E. P., & Erickson, G. S. (2005). Radio frequency identification tags: Commercial applications vs. privacy rights. Industrial Management
and Data Systems, 105(5/6), 703-713.
Kobelev, O. (2005). Big brother on a tiny chip:
Ushering in the age of global surveillance through
the use of radio frequency identification technology and the need for legislative response. North
Carolina Journal of Law and Technology, 6,
325.
McDonald, D. (1997). The electronic freedom of
information act amendments: A minor upgrade to
public access law. Rutgers Computer & Technology Law Journal, 23, 357.
Milne, G. R., Rohm, A. J., & Bahl, S. (2004). Consumers' protection of online privacy and identity.
Journal of Consumer Affairs, 38(2), 217-232.
Norian, P. (2003). The struggle to keep personal
data personal: Attempts to reform online privacy
and how congress should respond. Catholic University Law Review, 52, 803-806.
Nowadzky, R. A. (1996). A comparative analysis
of public records statutes. Urban Law, 28, 65.


O'Reilly, J. T. (1998). Expanding the purpose of


federal records access: New private entitlement
or new threat to privacy? Administrative Law
Review, 50, 371.
Pandozzi, N. R. (2001). Beware of banks bearing
gifts: Gramm-Leach-Bliley and the constitutionality of federal financial privacy legislation.
University of Miami Law Review, 55, 163.
Perritt, Jr., H. H. (1990). Federal electronic information policy. Temple Law Review, 63, 201.
Siris, M. J. (1997). New York's freedom from information law: Disclosure of public costs of a New York state senator's public interest mailings. Albany Law Review, 60, 1273-1294.
Smith, L. (2006). RFID and other embedded
technologies: Who owns the data? Santa Clara
Computer & High Technology Law Journal, 22,
695.

Solove, D. J. (2001). Privacy and power: Computer


databases and metaphors for information privacy.
Stanford Law Review, 53, 1393-1402.
Solove, D. J. (2002). Access and aggregation:
Public records, privacy and the constitution.
Minnesota Law Review, 86, 1137.
Terwilliger, G. (2006). EU data protection laws.
National Law Journal, 28(37), 18.
Wald, P. M. (1984). The freedom of information
act: A short case study in the perils and paybacks
of legislating democratic values. Emory Law
Journal, 33, 649.
Wichmann III, C. J. (1998). Ridding FOIA of
those unanticipated consequences: Repaving a
necessary road to freedom. Duke Law Journal,
47, 1213-1256.





Chapter XVII

The Legal Framework for Data and Consumer Protection in Europe
Charles O'Mahony
Law Reform Commission of Ireland, Ireland
Philip Flaherty
Law Reform Commission of Ireland, Ireland

Abstract
This chapter will discuss the legal framework for consumer and data protection in Europe. Central to this
discussion will be the law of the European Union (EU) on data and consumer protection.3 Recent years
have seen the creation of legal frameworks in Europe which seek to secure the protection of consumers
while simultaneously facilitating economic growth in the European Union. This chapter will outline the
main sources of law which protect consumers and their privacy. This chapter will outline the important
provisions in these sources of law and critically analyse them. The chapter will also point up the gaps
and deficiencies in the consumer and data protection legal structures.

Consumer Protection
There is a need for commercial law to respond to
the challenges posed by technology and the means by which technology has affected commerce. There is a specific need for law to respond to the
challenges posed by e-commerce. The proliferation of the Internet and the expansion in the use



of e-mail have become increasingly important for business in terms of the sale of goods and services
and also in terms of marketing.
It is well recognised that e-commerce has the
possibility of bringing major benefits to consumers. In particular, e-commerce has the potential
to promote competition among suppliers of goods
and services. It also has the possibility of permitting businesses to build up new mutually beneficial
relationships with consumers. From a European
perspective, there is a possibility for consumers
and service providers to benefit from an internal
market and cross border business transactions.
Despite these beneficial possibilities, new challenges and risks for consumers exist. In particular,
there is a concern with protecting consumers who
are increasingly engaged in cross-border business
transactions. The law must endeavour to adapt to
a medium which is not limited by frontiers. This
throws up many problems such as the choice of
jurisdiction in the case of an online consumer contract between parties in two different countries.
Another challenge identified by the European
Commission for e-commerce is that there is a trust
issue faced in achieving brand recognition by
consumers and a problem for businesses becoming
commercially viable and sustainable.4 The onward
march of globalisation and new markets has left
political and legal systems both at a national and
international level struggling to adapt their laws to
them. The new regulatory framework established
by the EU for e-commerce is an important step at
a regional level to ensure that the law keeps pace
with these developments (Wahab, 2004).

The European Union and Legislative Initiative
The European Union (EU) has become the driving
force for the initiation of policy development in
many diverse areas for its member states. Data
and consumer protection are cogent examples of
this. The concept of the European Union can be confusing; therefore, it is perhaps best at this point to describe the method by which the EU can direct this change.
The EU was created as recently as 1993 by the
Treaty on European Union. The EU is an overarching entity which contains three pillars: the pre-existing European Community (EC) pillar, the justice and home affairs pillar, and the common foreign and security policy pillar.5 The
concept of a common market is a fundamental
aim of the European Union and its policies have
been heavily influenced by this. Under Article 211
EC, the European Commission has been given
extensive powers such as its right of legislative
initiative to ensure the proper functioning and
development of the common market. There are
three modes of community lawmaking employed
by the commission: regulations, directives, and
decisions for a more lengthy discussion on community law-making (Craig & de Brca, 2003).
Regulations are binding upon, and are directly applicable in, all member states. Directives may not be applicable to all member states; they are binding only as to the end to be achieved and leave flexibility with the member states as to the mode by which this end is achieved. Decisions are binding in their entirety on those to whom they are addressed, under Article 249 (ex Article 189) EC.
The European Union has provided a legal
framework for e-commerce. The creation of this
framework through a series of directives tied in
with the goal of creating a European single market
(Directive 2000/31/EC [2000] O.J. L178/1 and
Directive 1999/93/EC [1999] O.J. L13/12). The
rationale of the European Commission in bringing
forward these directives was to inspire confidence
in e-commerce. These directives were introduced to facilitate the development and expansion of e-commerce, and their content does not limit or restrict e-commerce (Keller & Murray, 1999; Anassutzi, 2002).
These directives primarily dealt with the following issues:

The legal status of electronic contracts and digital signatures
The legal liability of intermediary service providers
The freedom to provide services

Essentially, the E-commerce Directive 2000/31/EC sought to establish a structure in which providers of e-services would be free to provide such services throughout the European Community. The 1998 directives provide circumstances in which a service provider will not be legally liable. The E-commerce Directive, under Article 9, requires member states to ensure that their domestic laws permit contracts to be completed online. As a result, contracts can be concluded electronically and will have legal effect in all member states of the European Union.
The directives from the EU also aimed at promoting confidence in e-commerce. Measures specifically aimed at promoting confidence include a requirement on service providers to formally acknowledge a consumer's order; this provision is contained in Articles 5 and 6 of the E-commerce Directive. The directive requires that the service provider's acknowledgement be electronic and that it be issued without delay.
These measures complement consumer protections set out in the 1997 Distance Selling Directive
and other EC regulations on commercial law.
The European framework for e-commerce is still developing, and as such there are still fundamental questions that need to be answered.

E-COMMERCE AND THE EUROPEAN UNION
The European Union has had a policy of promoting
and developing e-commerce. The rationale for this
policy has been to make the European Union the
most competitive and dynamic knowledge-based
economy in the world.

The European Commission produced Directive 2000/31/EC in 2000, which established the
basic legal framework for electronic commerce
in the internal market. This directive removed
obstacles to cross-border online services in the European Union. Importantly, the directive also provided legal certainty to businesses and citizens of
the European Union. The European Commission
introduced the directive with a view to creating
a legal framework that ensures:

That the information society services benefit from internal market principles of free movement of services and freedom of establishment in an enlarged European Union;
Monitoring and following up of Directive 2000/31/EC (including regular reports on its application);
The correct implementation of the directive by new and existing member states and application of the legal framework for electronic commerce as provided by Directive 2000/31/EC;
Appropriate following up of complaints concerning information society services; and
General monitoring of legal, technical, and economic developments of electronic commerce and the Internet.

The EU has also recognised that there is a need for dispute resolution for what is effectively a new means of commerce. As Rule (2002) notes, alternative dispute resolution or online dispute resolution has been viewed as a more cost-effective means to deal with Internet disputes, similar to endeavours of the U.S. Federal Trade Commission. Emphasising the need to build consumer confidence in e-commerce, the EU Commission has worked on developing the European extra-judicial network (EEJ-Net), an umbrella group linking all pre-existing ADR systems in individual member states. The advantages of ADR-based solutions for online contracts were also recognised by the OECD.

E-COMMERCE DIRECTIVE
It was recognised in the mid-1990s that the Internet, which was once only accessible to a privileged few, was becoming increasingly popular as a means of communication and of doing business. The challenge for the EU was to design a regulatory framework which would protect parties doing business online but also enable EU citizens to take advantage of new markets, making the EU a true knowledge-based economy (see European Commission, Directorate General Press and Communication, 2002, Towards a Knowledge-Based Europe).
The need for a regulatory framework for e-commerce was first officially recognised in 1997 when the commission adopted its communication "A European Initiative on Electronic Commerce" (IP/97/313, 16/04/1997). Single market
principles were recognised as being central to
the continued growth and regulation of the sector, to ensure maximum efficiency and certainty.
The commission identified four key areas where
action should be taken.
Firstly, it was recognised that there would need to be standardised systems and upgraded technological infrastructure within the EU. Secondly, and perhaps most importantly, single market principles were found to be integral to any future regulatory framework in the area. Thus the fundamental freedoms, such as the free movement of goods, services, people, and capital, and the freedom of establishment, would guide any future developments in the area. Therefore, it was made clear that single market principles would be grafted on to the emerging Internet market and would govern a variety of important issues such as distance selling, data protection, electronic signatures, and electronic payments. Importantly, it was also stated that the need for an analogous international regulatory framework would have to be explored, in a way that would reflect the needs of EU member states.
To this end, the Electronic Commerce Directive, which was adopted in 2000, established an internal market framework for electronic commerce.6 The directive established harmonised rules on issues such as transparency and information requirements for online service providers, commercial communications, electronic contracts, and limitations of liability of intermediary service providers.

THE INTERNAL MARKET AND THE COUNTRY OF ORIGIN PRINCIPLE
Article 3(2) of the E-commerce Directive ensures that the freedom to provide services extends to the provision of information society services: "Member States may not, for reasons falling within the coordinated field, restrict the freedom to provide information society services from another Member State."
The E-commerce Directive establishes a country of origin rule for regulation of the provision of online services within the EU. These services have been defined as "information society services," which covers a wide variety of economic services: "any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient of services" (Article 2 of the directive refers to the definitions in Article 1(2) of the earlier directives 98/34/EC and 98/84/EC).
Article 3 elaborates on the country of origin rule for the regulation of the provision of information society services: "Each Member State shall ensure that the information society services provided by a service provider established on its territory comply with the national provisions applicable in the Member State in question which fall within the coordinated field." Essentially, this means that the providers of such services will only be subject to the rules applicable in their own home country. This has important practical effects, as it avoids the inevitable confusion which could ensue if providers of these services had to be wary of the applicability of diverse laws and compliance with different standards.
Importantly, the country of origin principle is not a fundamental principle of EU law; it is not present in primary or secondary sources of law and was first enumerated in the Television without Frontiers Directive, 1989 (89/552/EEC), where it was limited to certain areas of TV advertising. The term "coordinated field" is an important one, essentially meaning the requirements with which an information service provider must comply in its specific area of expertise, such as qualifications, authorisation, or notification, in addition to other requirements such as the behaviour of the service provider and stipulations as to quality or content (Article 2(h)(i)).
This principle has the important effect that if
service providers are bound only to the regulations
applicable in their home state, there is a greater
incentive to harmonise such regulations within
the internal market. Hornle (2005) elaborates that
the principle may lead to greater coherence of
the internal market, as it requires cross-border cooperation in enforcement: the home country must cooperate with the country of destination.
It should be noted that, as in many areas of EU law and policy, member states may derogate from the prohibition on restricting the freedom to provide information society services from another member state, and may take such measures if certain conditions are fulfilled under Article 3(4). Such derogations cover diverse concerns such as public policy and public health. In keeping with European Court of Justice case-law, the directive requires that these measures be proportionate to their objective under Article 3(4)(a)(i).
Importantly, the protection of consumers, including investors, is recognised as an objective which could lead to such derogation. Thus, if the country of destination believes that it has a higher standard of protection regarding online gambling than that of another member state, then a derogation may be permitted (if deemed proportionate) in the protection of consumers. Others believe that ultimately the directive is a political compromise and may be difficult for lawyers to work with in reality.
The exclusion of online consumer contracts from the directive means that a maze of pre-existing rules covers the choice of jurisdiction in this
area. It became clear that greater simplification
was needed with the increase in e-contracts;
this led to the promulgation of the Jurisdiction
Regulation in 2000.

THE JURISDICTION REGULATION AND THE COUNTRY OF DESTINATION PRINCIPLE
The E-commerce Directive laid down a framework for future regulation and thus may be termed public law, as it does not cover private law issues such as consumer contracts. To this end, the EU approved the Jurisdiction Regulation in December 2000. The regulation replaces and expands on the Brussels Convention on Jurisdiction and Recognition of Foreign Judgments. This convention aimed to ensure that, for nationals of member states, the rules governing the enforcement of judgments and awards would be simplified. This was important to strengthening the internal market by ensuring greater uniformity in jurisdiction rules, including those governing consumer contracts.
Article 15 of the regulation essentially provides that the consumer may avail of the laws of his member state and consequently may be sued in that member state. Article 15(3) extends this to contracts conducted through the Internet.
Article 16(1) of the regulation importantly provides that the consumer may avail of the laws of the member state in which he is domiciled or the member state in which the other party is domiciled.
Article 17 provides that these provisions may be departed from by an agreement of both parties. This agreement can be entered into after the dispute has arisen, or if both parties are domiciled/habitually resident in the same member state, jurisdiction may be conferred on the courts of that member state.
Gillies (2001) pointed out that the EU has
upheld the traditional view that consumers are
the weaker party by allowing them to avail of
the laws of their own jurisdiction. This concern
is echoed in recital 13 of the regulation:
"In relation to insurance, consumer contracts and employment, the weaker party should be protected by rules of jurisdiction more favourable to his interests than the general rules provide for."
Correspondingly, the regulation places an obligation on businesses that conduct their commercial activities online within the EU to comply with the laws of all member states. Business organisations claimed during the intensive discussions prior to the approval of the regulation that this was effectively placing a huge administrative burden on small and medium-sized businesses to
ensure that they were conducting their affairs in
accordance with the laws of 15 different member
states (Wallis, 1999). However, as Oren (2003)
notes, any individual conducting commercial
cross-border transactions must take account of
the differing laws in other states and there is no
compelling reason why e-commerce should be
any different.
Critics of the regulation, such as Motion (2001), claim that the country of origin and country of destination principles conflict in the
special case of consumer contracts, meaning that
businesses must be aware of the risks in doing
business with nationals of other member states.

The complexity which currently exists between the Jurisdiction Regulation and the E-commerce Directive highlights the differences between personal and state cyberspace jurisdiction (Zekios, 2007). Cyberspace is a complex net of individuals and communities which transcends boundaries and affects territorial, state, and individual autonomy.
The tension between state and personal jurisdiction is clear from the tangle of rules concerning
jurisdiction in relation to e-contracts in the EU.

BUSINESS CONDUCTED VIA A WEB SITE
The E-commerce Directive lays down important
rules governing the information to be provided
by the seller to the consumer on a website prior
to purchase:
The various steps the consumer must take in order to place an order and when a contract is completed;
How the terms and conditions of the supplier can be accepted and retained by the consumer;
How the consumer can identify and rectify any technical errors; and
That a receipt must be sent to the consumer on completion of the contract.

ELECTRONIC SIGNATURES DIRECTIVE
The Electronic Signatures Directive 1999/93/EC, a community framework for electronic signatures, addresses the problems associated with the legal status of electronic signatures by permitting certification service providers to establish themselves without advance authorisation and to act as a third party in certifying the authenticity of a digital signature under Article 3 of the directive.




The stated aim of the directive is to promote the proper functioning of the internal market. The recognition of electronic signatures has important legal, technical, and commercial dimensions, and any legislation in this area required careful consideration.
As Walden (2001) suggests, the directive forms an important addition to the EU's e-regulation architecture, ensuring legal certainty, which is integral to the development of e-commerce in any jurisdiction. As a result of Article 5 of the directive, the domestic law of the different member states of the European Union has been harmonised to ensure equivalence between hand-written and digital signatures, and Article 5 permits the use of digital signatures based on certifications as evidence in legal proceedings.
Member States are required to ensure the following under the directive:

The provision of certification services does not require prior authorisation, but remains subject to supervision by a relevant body;
Secure signature-creation devices comply with the requirements set out in the directive and are controlled by appropriate public or private bodies in each member state;
Equivalent legal status between electronic and hand-written signatures;
The admissibility of electronic signatures as evidence in legal proceedings;
Certification service providers are liable for damage caused to any entity or legal or natural person who relies on such a certificate; and
Certification service providers must comply with all data protection laws.
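To make concrete the technical operation that these provisions place on a legal footing, the following sketch illustrates, in very simplified form, how a digital signature is created and then checked by a relying party. It is an illustrative example only and is not drawn from the directive: it assumes the availability of the third-party Python "cryptography" package, and it generates the key pair locally so that the example is self-contained, whereas in practice the signer's public key would be distributed in a certificate issued by a certification service provider.

    # Illustrative sketch only (assumes the Python "cryptography" package is installed).
    # In practice the verifying key would come from a certificate issued by a
    # certification service provider; here a key pair is generated locally so the
    # example runs on its own.
    from cryptography.hazmat.backends import default_backend
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    # The signer's key pair (the private key never leaves the signer).
    signer_key = rsa.generate_private_key(
        public_exponent=65537, key_size=2048, backend=default_backend()
    )

    contract = b"Offer accepted: 100 units at EUR 4.50 each, delivery within 30 days."

    # The signer creates the electronic signature over the contract text.
    signature = signer_key.sign(contract, padding.PKCS1v15(), hashes.SHA256())

    # The relying party (the other contracting party, or a court) verifies the
    # signature with the signer's public key; verify() raises InvalidSignature
    # if the document or the signature has been altered.
    signer_key.public_key().verify(signature, contract, padding.PKCS1v15(), hashes.SHA256())
    print("Signature verified: the document is authentic and unaltered.")

The legal significance of the directive, as discussed above, is that a signature verified in this way, when supported by an appropriate certificate, must be admissible in evidence and treated as equivalent to a hand-written signature in the member states.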

FUTURE CHALLENGES
To become binding within individual member states, the directives must be transposed into national law. The nature of a directive leaves flexibility to individual member states to transpose the contents of a directive into law in a manner which best suits their traditions and legal systems. In Ireland, the Electronic Commerce Act 2000 (No. 27 of 2000) implemented the Electronic Signatures Directive and a number of provisions contained in the E-Commerce Directive. The greater bulk of the provisions of the E-Commerce Directive were transposed into law by the European Communities (Directive 2000/31/EC) Regulations 2003. These regulations are enforced by the data protection commissioner and the director of consumer affairs in Ireland. The Irish approach is a light-touch regulatory approach and this is regarded by the government as the best method of regulation in Ireland (Colgan, 2003).
The 10 new member states began transposition prior to their accession into the EU. Poland, for example, passed an act on July 18, 2002 relating to the provision of services by electronic means. The transposition of the directive into the law of all the member states of a larger EU is an important development for the future growth of e-commerce in the EU. This harmonisation ensures a large and efficient e-market in the EU. However, it has been recognised that the same perceived flaws identified in the directive remain in the transposing legislation, as the flexibility allowed in transposition must still remain faithful to the directive's provisions (Kryczka, 2004). Therefore, if there are problems with the E-commerce Directive, at least they are uniform and can therefore be addressed by a review if one is thought to be required in the future. It is clear from the legal framework that has been constructed for e-commerce in the EU that the focus has been on ensuring the proper functioning of the internal market, competitiveness, and economic growth in Europe. While this is important, there needs to be a greater emphasis placed on consumers and the protection that they are afforded.


SUB-CONCLUSION
The directives discussed here constitute an important contribution to the overall e-regulation architecture. The transposition of the directives into the national laws of member states enhances the harmonisation of the rules governing e-commerce in Europe. This is vital to the economies of both the more established member states and the new accession member states. There is considerable unease that common law principles are being squeezed out in favour of civil law principles in this new application of law to e-commerce. Roebuck contends that greater notice should be taken of common law approaches as they are always more trade-friendly than their civil law counterparts (Roebuck, 2002). Flexibility remains important and it is feared that heightened legislative activity may inhibit this.
A strong regulatory scheme which is flexible is important for consumer confidence. However, certain commentators have found the exclusion of consumer contracts from the country of origin principle difficult to justify (Moerel, 2001). Yet, as is so often the case with resolutions adopted between states, a certain amount of political compromise is needed, which lawyers may find difficult to work with in practice. Ultimately, e-commerce is of a global nature and the success of the EU venture points to the greater need for a global answer to a global regulatory problem.

DATA PROTECTION
Consumers traditionally have been able to shop
anonymously with little or no intrusion into their
private lives. However, shopping online requires
consumers to divulge much more personal information than was required in the past. Even where
consumers are merely enquiring about goods
and services, they may be required to provide personal data. While this requirement to divulge information is normally justifiable, there are situations where consumers reveal information unnecessarily.
Legislation aimed at protecting the personal
data of citizens in Europe has emerged as a response to concerns with how commercial and
governmental agencies use personal data. The
law relating to data protection is often extremely
technical and overly complicated.7 Understanding
the law in relation to data protection presents challenges for consumers, businesses, and government
agencies. Despite the presence of a number of key
harmonising directives from the European Commission, the law still remains confusing.8
The law on data protection in Europe is particularly puzzling, due to the distinctive nature of European Union law and fast-paced developments.9 This chapter discusses the main sources of law relating to data protection in Europe, the most important of which is the Data Protection Directive 1995.10

THE COUNCIL OF EUROPE


The Council of Europe was founded in 1949 and is a separate organisation from the European Union. The council seeks to develop in Europe common and democratic principles based on the European Convention on Human Rights and texts on the protection of individuals.11 The Council of Europe has played an important role in protecting the personal data of Europeans. The Council of Europe's European Convention on Human Rights and the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data have been influential in developing the law on data protection in Europe.




THE RIGHT TO PRIVACY: ARTICLE 8 OF THE EUROPEAN CONVENTION ON HUMAN RIGHTS AND FUNDAMENTAL FREEDOMS (ECHR)
Article 8 of the European Convention on Human Rights and Fundamental Freedoms (ECHR) establishes that "everyone has the right to respect for his private and family life, his home, and his correspondence."12 This right under Article 8 of the ECHR can only be restricted by a public authority in accordance with the domestic law of a country and only in so far as it is necessary for the defence of a legitimate objective.13
The protection for privacy provided for under
Article 8 is an important source of law in relation
to data protection, particularly as all European
states are signed up to the European Convention
on Human Rights. The European Court of Human
Rights has given a usefully broad interpretation
to Article 8 of the convention and this has been
an important factor in influencing states to ensure
an adequate level of protection of privacy in their
national laws.14

COUNCIL OF EUROPE CONVENTION FOR THE PROTECTION OF INDIVIDUALS WITH REGARD TO AUTOMATIC PROCESSING OF PERSONAL DATA 198115
This convention drew inspiration from the European Convention on Human Rights. In particular, the convention was inspired by Article 8. The Council of Europe, through these guidelines, established a structure of precise principles and norms which were aimed at preventing the unjust compilation and processing of personal data. This work culminated in the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data. This convention required contracting states to make changes in their domestic law so as to give effect in law to the principles relating to personal data.
The principles contained within the convention
relate mainly to:
Promoting fair and lawful collection and automatic processing of data;
Storage of data for specified legitimate purposes and not for ends incompatible with those purposes;
The retention of data for no longer than is necessary;
The confidentiality of sensitive data; and
The right of access and rectification of information.
This convention, drafted in 1981, has been overtaken to some degree by technological developments. Technological developments have generally resulted in benefits for consumers and members of society in their ability to become, as the Council of Europe phrases it, "active agent(s) of the information society."16 However, these developments have created new concerns, particularly in relation to privacy and the greater interference by the information systems of numerous public and private services: banks, credit services, social security, social assistance, insurance, police, and medical care.
The convention, while not legally enforceable, established a number of principles which have been influential in the development of data protection law in Europe. However, as these principles are not legally binding, variation resulted in the laws on data privacy throughout Europe. In addition, the challenges posed by technological developments have necessitated a more systematic approach to ensuring privacy in Europe. The Council of Europe has noted that, since the 1960s, rapid progress in the field of electronic data processing and the first appearance of mainframes allowed public administrations and big enterprises to set up extensive data banks and to improve and increase the collection, processing, and interlinking of personal data.
The Council of Europe has not amended the
convention or created many new protocols to deal
with technological developments. Rather, the
council has opted to produce recommendations
to deal with new challenges. This approach is
preferred by the council as these recommendations are easier to draft and to adopt.
While these recommendations are not legally binding on member states, they do, however, set down standards of reference for all member states, whether they are parties to the convention or not. The recommendations from the Council of Europe in data protection are useful in that they request a state to consider the possibility of expanding and implementing domestic law in compliance with the internationally agreed interpretation of the principles set down in the convention. This is an effective means of promoting better regulation in Europe, ensuring that gaps and deficiencies in data protection laws are remedied.

OECD GUIDELINES ON THE PROTECTION OF PRIVACY AND TRANSBORDER FLOWS OF PERSONAL DATA 1980
The Organisation for Economic Co-operation and Development (OECD), established in 1961, is composed of governments which are committed to democracy and the market economy. Among the aims of the OECD are supporting sustainable economic growth, assisting countries in economic development, and contributing to world trade.
The OECD published its Guidelines on the Protection of Privacy and Transborder Flows of Personal Data in 1980 (OECD, 1980). These guidelines represented, and continue to represent, intercontinental consensus on regulation relating to the compilation and administration of personal information. The OECD guidelines have been, and continue to be, important in assisting European governments, businesses, and consumer agencies in protecting the privacy of personal data. The guidelines are also important in preventing needless restrictions to transborder data flows, both online and off-line. The guidelines were aimed at addressing unnecessary restrictions on data flows by providing a means of harmonising the laws of different countries. The OECD guidelines have been influential not only in the development of privacy protection in Europe but also throughout the world.

DATA PROTECTION DIRECTIVE 95/46/EC
The Data Protection Directive was introduced in 1995 with the aim of harmonising the data protection laws within the European Union. (Member states of the European Union were given 3 years in which to implement the provisions of the directive into their national law.) Essentially, the directive facilitates the movement of data throughout the EU, while simultaneously affording EU citizens a higher standard of protection in the domestic law of the EU state in which they are resident. The Data Protection Directive 1995 relates primarily to computerised files containing personal data. Under the directive, however, data which is not processed automatically still comes under the scope of the directive as long as the data forms part of a filing system. It is important to note that the directive does not apply to activities which are outside the remit of EU law.
All member states of the European Union, including the newer member states, have enacted data protection laws in their domestic law, which give effect to the provisions of the 1995 directive. The 1995 directive has been complemented by Directive 2002/58/EC, which is known as the directive on privacy and electronic communications. The European Court of Justice (ECJ) has interpreted the scope of the Data Protection Directive widely in its case law (see, for example, the Lindqvist decision, C-101/01 [2003] ECR I-6041).
The Data Protection Directive increased regulation with respect to the processing of data and
the transferring of data outside the EU. The Data
Protection Directive sets out clear and readily
understandable principles about how personal data
ought to be handled. The most significant result
of this directive is that it has given people rights
to challenge mishandling of their data.
The directive has removed obstacles to the
flow of data throughout the European Union.
For example, businesses no longer face the difficulties previously encountered in transferring
data relating to employees from one member
state to another member state. The European
Data Protection Directive 1995 contains general
provisions aimed at ensuring that data subjects
are informed of their rights regarding data protection. The following are some of the more relevant
articles of the directive:
Article 6(1)(a) requires that personal data be processed fairly and lawfully;
Article 7 sets out a number of conditions that need to be complied with before personal data can be legally processed;17
Article 8 prohibits the processing of data which reveals a data subject's ethnicity, religious beliefs, political beliefs, membership of a trade union, criminal record, health, and sex life (unless exemptions exist);
Article 10 outlines the minimum information that must be provided to the data subject in cases when the data is collected directly from him or her;
Article 11 outlines the minimum information that must be provided to the data subject in cases when data about him or her is collected from a third party; and
Article 14 requires that data subjects are informed before personal data is disclosed to third parties.18



INTERNATIONAL DATA TRANSFERS


The Data Protection Directive 95/46/EC introduced the principle that member states should
transfer data outside of the EU to non-member states only where adequate protection is provided.
The European Commission plays a particularly
important role in terms of international data transfers. The commission decides whether a country
outside the EU offers an adequate level of data
protection and approves the contracts for data
transfers. Article 25 of the directive empowers
member states and the commission to determine
the level of legal protection in another state.
Importantly, Article 25 of the directive also sets
out clearly the criteria for assessing the adequacy
of protection in a country outside the EU.19 Under
Article 25 of the directive it is necessary to make
a decision in the light of all the circumstances surrounding a data transfer operation. Article 25(2)
requires particular consideration to be given to
the nature of the data, the purpose and duration
of the proposed processing operation, and the
country of origin and country of final destination. The rules of law, both general and sectoral,
in force in the third country in question and the
professional rules and security measures which
are complied with in that country are also required
to be considered under Article 25(2).
When the directive was first published there
were concerns that the restrictions on the flow of
data to countries outside of the European Union
would restrict the development of an information
superhighway. However, it is now generally accepted that the safeguards on data flows outside
of the EU have not unnecessarily restricted data
flows. Indeed, the transposition of the directive
into the laws of the member states of the EU has
encouraged countries outside of the European
Union to adopt best data protection practice and
to ensure the operation of adequate data protection laws in their country.

Nevertheless, there is still much criticism of the European Data Protection Directive by businesses, suggesting that the directive does not
facilitate business and the needs of e-commerce.
In the Lindqvist decision, the ECJ took a restrictive view of what constituted a transfer of data
outside the EU. The ECJ held in that case that the
placing of an item on a Web site did not amount
to a transfer, even though anybody could access
the Web site internationally. This decision has
been the focus of much criticism.

IMPLEMENTATION OF THE DATA PROTECTION DIRECTIVE 95/46/EC
As already discussed, it is the responsibility of
member states to give legal effect to the Data Protection Directive 1995. The significant provisions
of the directives are, however, contained in the
law of the individual EU member states. Problems
have inevitably arisen in the transposition of the
Data Protection Directive into the domestic law
of the different member states of the European
Union.20 A further difficulty arises in that many
aspects of data protection laws are derived from
administrative practices and informal customs
which are not in written form (Kuner, 2007). This
is the case with the implementation of the Data
Protection Directive. In particular, the different
enforcement practices in different member states
have given rise to confusion and legal uncertainty
from one EU member state to another.21
The European Commission published a report
on the implementation of the Data Protection
Directive 1995. This report also sets out a work
programme for the better implementation of the
Data Protection Directive. It is clear from this
report that levels of effective enforcement vary
widely in the EU, particularly in some of the
newer member states. The most recent Flash Eurobarometer 2003 Survey of company practices
indicated clearly that compliance with the current
information requirements is problematic. The

survey demonstrated that there is not consistent compliance with data protection legislation. For
example, the survey reported that companies do
not provide individuals with the information to
which they are legally entitled. Research has
demonstrated that only 37% of companies reported
that they systematically provided data subjects
with the identity of the data controller and only
46% said they always informed data subjects of
the purposes for which the data would be used (see
Article 29, Working Party Opinion). However, the
Eurobarometer Survey indicated that larger companies were more likely to provide the required
information than smaller businesses.
The main thrust of the most recent European
Commission report on the implementation of
the Data Protection Directive demonstrates that
EU law is achieving its main aims. In particular,
the report suggested that the 1995 Data Protection Directive has broadly achieved the aim of
ensuring strong protection for privacy while
making it easier for personal data to be moved
around the EU. The report did point up the late
implementation by member states of the directive.
The report was also critical of differences in the ways the directive is applied at national level. The European Commission suggested that this prevented the European economy from benefiting fully from the directive.

DATA PROTECTION AGENCIES IN EUROPE
Most countries in the European Union have created
an office charged with ensuring compliance with
data protection laws. For example, the information commissioner's office is the data protection body for the United Kingdom, while the data protection commissioner's office is the relevant body in Ireland.22 Complaints, problems, and queries in relation to data protection are increasingly being directed to these bodies. As the Council of Europe has noted, these bodies play an integral part in the control system in a democratic society. Complaints to these data protection agencies will result in an investigation into the complaint and action by the agency to address the complaint.
These data protection agencies maintain a
register which provides general information about
data handling practices of many important data
controllers. Organisations such as government departments, financial institutions, and bodies that maintain sensitive types of personal data are included on this register.
These data protection agencies have an important role in protecting consumers from excessive
collection of personal data. For example, the Irish
data protection commissioner publishes case
studies of instances where his office has taken action against different institutions, including financial institutions, for collection of excessive amounts of personal data. These case studies are illustrative
and easily understandable and are a good source
of information for people who have concerns.
As already mentioned, Article 29 of the Data
Protection Directive 1995 established a working
party. Under Article 29(2) the membership of
the working party is made up of the data protection commissioners from the different European
Union member states along with a representative
of the European Commission. The involvement
of these data protection agencies in the working
party means that problems encountered by these
agencies in monitoring the application and operation of the data protection laws are formally
filtered back to the European Commission. The
working party advises the European Commission on the levels of data protection measures in
place in countries outside the European Union.
The European Commission benefits from the advice of these European data protection agencies.
This is beneficial as these bodies often deal with
complaints regarding the processing of personal
information outside of the European Union.
These data protection agencies have demonstrated their ability to ensure compliance with
the law. Recent examples of this include a raid by the Irish data protection commissioner's office on businesses involved in the cell phone marketing sector. The commissioner took action following a large number of complaints from members of the
public. The data protection commissioner seized
large volumes of documents from these companies with a view to prosecution of companies that
sent unsolicited communications to subscribers
and who failed to comply with their obligations
regarding privacy.
The information commissioner in the United Kingdom has also been proactive, according to its annual report, and is increasingly taking legal action against businesses that fail to comply with consumer and data protection legislation. Section 55 of the UK's Data Protection Act 1998 makes it an offence to obtain, disclose, or procure the disclosure of personal information knowingly or recklessly, without the consent of the organisation holding the information. Under the Data Protection Act 1998, a conviction under Section 55 of the Act can result in a fine of up to £5,000 in the magistrates' court and an unlimited fine in the crown court. The information commissioner in 2006 recommended that the penalty for prosecution of the offence under Section 55 of the Data Protection Act 1998 be amended. The amendment proposed by the commissioner would see the introduction of a prison sentence on indictment of up to a maximum of 2 years, or a fine, or both. A prison sentence of up to 6 months, or a fine, or both, for a summary conviction was also recommended by the commissioner.23 Such a change in the law would be welcome and would provide the UK information commissioner with real teeth in terms of pursuing persons and organisations who fail to comply with the United Kingdom's data protection laws. Moreover, the down-to-business approach of the UK commissioner in seeking stronger legal penalties for breach of the data protection laws demonstrates a tenacity in the approach of the commissioner's office in promoting a culture of respect and legal compliance with the processing of personal data.

It is clear from the reports from the data protection agencies that there is a significant illegal trade in confidential personal information.
As the UK information commissioner suggests, this demonstrates the need for a strong deterrent and greater attentiveness to the law by organisations. The action taken by the Irish and UK data protection bodies sends a strong signal to businesses that violate the data protection laws. The call by the UK information commissioner also demonstrates the need for European Union member states to review their laws to ensure that the criminal sanctions for breach of the data protection law are sufficient to deter commercial entities (and other data controllers) from illegally processing personal data.

PRIVACY NOTICES
Privacy statements inform online consumers how
their personal information will be used by the company from which they purchase goods and services
and are thus an important source of protection.24
However, there has been much criticism about the
effectiveness of these privacy notices, particularly
as these notices are presented in a convoluted way,
using language which is unnecessarily technical
and legal. This issue has been raised by Consumers
International in a study published in 2007. The Article 29 Data Protection Working Party (which was set up under Article 29 of the Data Protection Directive and is an independent EU advisory body whose remit is in the area of data protection and privacy) issued direction on corporate privacy notices in 2004. The concept of multi-layered privacy notices set out in this direction is important, as such notices can effect improvements in the quality of information received by consumers on data protection. It does this, as the working party pointed out, by focusing on the information that the individual needs to understand his or her position and to make decisions. The working party called for layered and easily readable privacy statements

on Web sites. While this direction is not legally binding upon businesses, it has resulted in many
businesses in Europe adopting the best practice
suggested by the working party.
Among the principles set out by the working party is the suggestion that information provided to data subjects should use language and layout that is easy to understand. The concept of comprehension is important because, as the working party points out, it will allow a data subject to make informed decisions and have the knowledge and understanding to influence the practices of data controllers and processors. The working party also pointed out that it was important to ensure that the information is given in an appropriate manner, so that it meets the needs of children, for example.
There is no obligation on any member state of the European Union to realise these standards in its domestic laws. The guidance on privacy notices set out by the working party, if set in law, could certainly play a role in better informing consumers about how their personal data will be used.

CONSUMER PROTECTIONS
As Fielder (2002) pointed out, despite the creation
of data protection laws throughout the EU through
the Data Protection Directive 1995, research demonstrates that there is still widespread neglect
of good privacy practice and lack of compliance
with data protection legislation. This has obvious and serious implications for consumers, and
is particularly concerning due to the continuing
growth and availability of more sophisticated data
collection technology.
In 2004 the European Commission published a report entitled "Commission Staff Working Document, Consumer Confidence in E-Commerce: lessons learned from the e-confidence initiative," which examined consumer confidence in e-commerce. The report identified that competitiveness within the European economy is a key priority for the European Union and that consumer protection helps to ensure competitiveness in the European economy. This is important from a consumer protection point of view, as the creation of good consumer protection rules and systems has been given attention by the European Commission, which aims to strengthen consumer confidence in the functioning of the market (European Commission, 2004).
Consumers who lack confidence in the workings of the market and in how
they are protected in their member state will be
unwilling to make significant purchases in other
EU states. As e-commerce is particularly affected
by lack of confidence, there is a need for national
governments and the European Commission to
promote consumer confidence by ensuring that
the current laws on data protection are effective
and by responding to emerging challenges posed
by technological innovation.
The Council of Europe created Data Protection Day on January 28, which has been supported by the European Commission. The date for Data Protection Day was chosen to mark the date of signature of the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data. This event will be held on a Europe-wide basis with the aim of informing people about their personal data rights. This is an important type of initiative, which can play an important role in empowering citizens of the European Union and informing them as to their rights regarding the use of their personal data. As the vice-president of the European Commission noted, it is of paramount importance that EU citizens are aware of their rights, as every time they surf the Internet, make travel arrangements, use their credit card, or receive medical treatment, they supply personal information which, if misused, could result in a serious invasion of privacy (Frattini, 2007).


FUTURE DEVELOPMENTS
It is clear that data protection agencies have an
important role to play in ensuring that the data
protection legislation achieves its goals. In this regard, the publication of research which highlights
international difficulties with data protection is
paramount. Of course this will be contingent upon
adequate funding which will permit these agencies
to produce this work. Consumer literacy in terms
of data protection will also form a key component
in realising the data protection principles set out
in the Data Protection Directive 1995.
An important component of ensuring the
protection of personal data is the availability of
adequate training for businesses and data holders.
It is essential that businesses know the law and are
provided with incentives for protecting personal
data and processing the data in accordance with
the law. It is also important that businesses are provided with model privacy policies. Policies which are concise and readily understandable could be effective tools in promoting a culture of respect for, and further compliance with, the data protection laws throughout Europe.
Also significant will be the increased availability of fast and effective redress for consumers
and data subjects where there is infringement of
their rights under the data protection laws. These
issues require consideration and action by the
European Commission and national European
governments.

SUB-CONCLUSION
Recent years have seen a rise in concern with how
personal data is handled. These concerns have
resulted in the introduction of data protection laws
which aim at guaranteeing that personal data is
handled appropriately. However, the creation of
these legal rights and legal protections will only stem the misuse of personal data if people know about the law and their rights and know how to
access legal protections. The European Commission has been somewhat proactive in promoting
knowledge about data protection and the rights
of data users. However, it is clear that more needs
to be done to ensure that citizens of the European
Union are equipped with the necessary knowledge
to ensure that their personal data is treated with
respect and in accordance with the law.
The free flow of personal information throughout the European Union is crucial to almost all
forms of economic activity in the European Union.
The challenge for the European Commission,
national governments, consumer agencies, and
national data protection agencies is to keep the law
and technological innovations under review and
respond to threats to the privacy of consumers. To
a large extent the European Commission's proactive approach has been successful in monitoring
the EU legislation and engaging with consumers
and interest groups. However, as research has
demonstrated, there is still the hugely important
issue of compliance with the directive, which
requires much work from some EU states. The
Data Protection Directive has been successful in
promoting the easier flow of information within
the European Union. However, the focus now
needs to be on ensuring greater compliance with
the law, particularly from businesses that have
benefited from the free flow of data.

CONCLUSION
The legal frameworks that have been constructed to protect consumers and their privacy are not perfect; many gaps, deficiencies, and shortcomings still exist. However, the legal framework constructed does provide the foundation for the development of laws in Europe which will be responsive and effective in protecting consumers from unethical processing and use of their personal data and from exploitative business practices. The European Commission and European governments need to keep the law under
review to ensure that it responds and evolves to
protect consumers. Consumer rights agencies and
data protection agencies will increasingly have a
role to play in ensuring that consumers and data
subjects are aware of their legal rights and are
empowered to assert them.

REFERENCES
Anassutzi, M. (2002). E-commerce directive
00/31. International Company and Commercial
Law Review, 13(9), 337-342.
Brazell, L. (2004). Electronic signatures law and
regulation (1st ed.). London: Sweet & Maxwell.
Colgan, N. (2003) Ireland: Electronic commerce
directiveimplementation into Irish law. International Trade Law and Regulation, 9(2).
Corbett, R. (1993). The treaty of Maastricht.
London: Longman.
Craig, P., & de Búrca, G. (2007). EU law: Text, cases and materials (4th ed.). Oxford, UK: Oxford University Press.
Ellis, H. (2004). Modern Irish commercial and
consumer law. London: Jordan Publishing.
Fielder, A. (2002). Better compliance: guidance,
enforcement & self-regulation. Paper presented
at the Data Protection Conference and Report on
the implementation of Directive 95/46/EC, 2002.
Retrieved October 13, 2007, from http://ec.europa.
eu/justice_home/fsj/privacy/docs/lawreport/
fielder_en.pdf
Gillies, L. (2001). A review of the new jurisdiction
rules for electronic consumer contracts within the
European Union. Journal of Information Law and
Technology (1).
Hornle, J. (2005). Country of origin regulation in cross-border media: one step beyond the freedom to provide services? International and Comparative Law Quarterly, 54, 89-126.


Johnson, D., & Post, D. (1996). Law and borders: The rise of law in cyberspace. Stanford Law Review, 48, 1367.
Keller & Murray (1999). IT law in the European
Union. London: Sweet and Maxwell.
Kryczka, K. (2004). Ready to join the EU information society? Implementation of e-commerce
directive 2000/31/EC in the EU acceding countries: The example of Poland. International Journal of Law & Information Technology, 12, 55.
Kuner, C. (2007). European data protection law:
Corporate regulation and compliance. USA:
Oxford University Press.
Moerel, L. (2001). The country of origin principle
in the e-commerce directive: the expected one
stop shop. Computer and Telecommunications
Law Review, 7(7), 184-190.
Motion, P. (2001). The Brussels regulation and
e-commerce: A premature solution to a fictional
problem. Computer and Telecommunications Law
Review, 7(8), 209-215.
Mowbray, A. (2007). Cases and materials on the
European convention on human rights (4th ed.).
Oxford University Press.
Oren, J. S. T. (2003). International jurisdiction over
consumer contracts in e-Europe. International
and Comparative Law Quarterly, 52, 665.
Roebuck, W. (2002). Jurisdiction and e-commerce.
Computer and Telecommunications Law Review,
8(2), 29-32.
Rule, C. (2002). Online dispute resolution for
business. San Francisco: Jossey-Bass.
Schellekens, M. H. M. (2004). Privacy and electronic signatures: are they compatible? Computer
and Telecommunications Law Review, 10(7),
182-186.



Wahab, M. (2004). Globalisation and ODR:


dynamics of change in e-commerce dispute
settlement. International Journal of Law and
Information Technology, 12(1), 123-152.
Walden, I. (2001). Regulating e-commerce:
Europe in the global e-conomy. European Law
Review, 26(6), 529-547.
Wallis, D. (1999) Report on the proposal for a
council regulation on jurisdiction and the recognition and enforcement of judgements in civil
and commercial matters. Committee on Legal
Affairs in the Internal Market. COM (1999) 348
- C5-0169/1999 - 1999/0154 (CNS).
White, R. C. A., & Ovey, C. (2006). Jacobs and
White: The European convention on human rights
(4th ed.). Oxford, UK: Oxford University Press.
Winn, J. K., & Wright, B. (2001). The law of
electronic commerce. Aspen Publishers.
Zekios, G. I. (2007). State cyberspace and
personal cyberspace jurisdiction. International
Journal of Law and Information Technology,
15, 1-37.

PRESS RELEASES AND PUBLICATIONS
Commission Staff Working Document, Consumer
Confidence in E-Commerce: lessons learned from
the e-confidence initiative Brussels, 8.11.2004,
SEC (2004) 1390. Retrieved October 13, 2007,
from http://ec.europa.eu/consumers/cons_int/ecommerce/e-conf_working_doc.pdf
Consumers International, Credibility on the
web: An international study of the credibility
of consumer information on the internet. Retrieved October 13, 2007, from http://www.
consumersinternational.org/Shared_ASP_Files/
UploadedFiles/205F49EB-D048-43B0-A2B09596B2287BA5_Doc320.pdf


The Convention on jurisdiction and the enforcement of judgments in civil and commercial matters, signed at Brussels, 27 September 1968, OJ
L299/32 1968. Retrieved 13 October, 2007, from
http://eur-lex.europa.eu/LexUriServ/LexUriServ.
do?uri=CELEX:41968A0927(01):EN:HTML
Council Regulation (EC) No 44/2001 of 22
Dec 2000 on jurisdiction and the recognition
and enforcement of judgments in civil and commercial matters. Also known as the Brussels
I Regulation. Retrieved October 13, 2007, from
http://eur-lex.europa.eu/LexUriServ/LexUriServ.
do?uri=CELEX:32001R0044:EN:HTML
Directive 2000/31/EC. Retrieved October 11, 2007,
from http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:32000L0031:EN:NOT
Directives 98/34/EC and 98/84/EC.
European Commission, Directorate General
Press and Communication (2002). Towards a
knowledge-based Europe: The European Union
and the information society. Retrieved October
13, 2007, from http://ec.europa.eu/publications/
booklets/move/36/en.pdf
European Commission, The Special Eurobarometer Survey on Data Protection. Retrieved October
13, 2007, from http://ec.europa.eu/public_opinion/archives/ebs/ebs_196_en.pdf
First Report on the Implementation of the Data
Protection Directive (95/46/EC), COM(2003) 265
final (Brussels, May 15, 2003). Retrieved October 13, 2007, from http://eurlex.europa.eu/LexUriServ/site/en/com/2003/com2003_0265en01.
pdf
IP/97/313, Date: 16/04/1997. Retrieved October 13,
2007, from http://europa.eu/rapid/pressReleasesAction.do?reference=IP/97/format=HTML&age
d=1&language=EN&guiLanguage=en
OECD Conference to Examine Alternative Dispute Resolution Mechanisms for On-Line Commerce. The Hague, 11-12 December 2000.

Press Release: Data Protection Commissioner conducts raids on the Mobile Phone Text Marketing Sector. Retrieved August 20, 2007, from
http://www.dataprotection.ie/documents/press/
PR200707.doc
Statement from Vice-President Frattini, on behalf
of the European Commission, on the occasion of
Data Protection Day (28 January). Retrieved October 13, 2007, from http://europa.eu/rapid/pressReleasesAction.do?reference=IP/07/format=HT
ML&aged=0&language=EN&guiLanguage=en
Television Without Frontiers (TVWF) Directive (89/552/EEC).
What Price Privacy Now? (2006). Information Commissioner's Office. Retrieved October 13,
2007, from http://www.ico.gov.uk/upload/documents/library/corporate/research_and_reports/
ico-wppnow-0602.pdf

SOURCES OF LAW
Data Protection Act 1998 (UK).
Data Protection Directive 95/46/EC.
Directive 2002/58/EC
Electronic Commerce Act, Number 27 of 2000
(Ireland).
European Communities (Directive 2000/31/EC)
Regulations 2003 S.I. No. 68 of 2003 (Ireland).
European Convention on Human Rights and Additional Protocols is available from the Council of
Europe's Web site. Retrieved October 13, 2007,
from http://www.echr.coe.int/NR/rdonlyres/
D5CC24A7-DC13-4318-B457-5C9014916D7A/0/
EnglishAnglais.pdf
OECD Guidelines on the Protection of Privacy and
Transborder Flows of Personal Data. Retrieved
October 13, 2007, from http://www.oecd.org/
document/18/0,2340,en_2649_34255_1815186_
1_1_1_1,00.html




ADDITIONAL READING

Börner, F., & Spindler, G. (2002). E-commerce law in Europe and the USA. Europe: Springer.

Bygrave, L. (2002). Data protection law: Approaching its rationale, logic and limits. Springer.
Carey, P. (2007). Data protection: A practical
guide to UK and EU law. Oxford University
Press.

Chissick, M., & Kelman, A. (2002). Electronic commerce: Law and practice. Sweet & Maxwell.
Delta, G. B., & Matsuura, J. H. (2001). Law of the
internet. USA: Aspen Publishers Online.
Doukidis, G. I., Mylonopoulos, N., & Pouloudi,
N. (2004). Social and economic transformation
in the digital era. Idea Group Inc (IGI).
Lawrence, A. (2003). The law of e-commerce.
LexisNexis.
Nugter, A. (1990). Transborder flow of personal
data within the EC. Springer.
Plotkin, M. E., Wimmer, K. A., & Wells, B.
(2003). E-commerce law & business. USA: Aspen
Publishers Online.
Reed, C., & Angel, J. (2007). Computer law: The law and regulation of information technology (6th ed.). Oxford University Press.

Schultz, A. (2006). Legal aspects of an e-commerce transaction. In Proceedings of the International Conference in Europe. Sellier: European
Law Publishers.
Singleton, S. (2003). E-commerce: A practical
guide to the law. UK: Gower.
Stein, S. D., (2003). Law on the web: A guide
for students and practitioners. USA: Pearson
Education.



ENDNOTES

Charles O'Mahony, B.A., LL.B (NUI), LL.M (Lond), LL.M (NUI), is a legal researcher for the Law Reform Commission of Ireland and is a senior tutor in law at University College Dublin.

Philip Flaherty, BCL, LL.M (NUI), Diop sa Gh (NUI), is a legal researcher for the Law Reform Commission of Ireland.
The European Union is a body of democratic European countries that work together. The EU is a unique organisation in that it is neither a state nor a conventional international organisation. The member states of the EU have common institutions to which they delegate some of their sovereignty, so that decisions on specific issues can be made at European level. At the moment, the EU embraces 27 countries and 490 million people. More information about the organisation and structure of the EU can be obtained from http://www.europa.eu.
For additional information see the European
Commission's Web site at: http://ec.europa.
eu/consumers/cons_int/e-commerce/index_en.htm.
For more information on the evolution of the
European Union see Corbett, R. (1993). The
treaty of Maastricht. London: Longman.
The ambit of this directive is significant: it includes online information services
such as online newspapers, online selling
of products and services (books, financial
services and travel services), online advertising, professional services (lawyers, doctors,
estate agents), entertainment services, and
basic intermediary services (access to the
Internet and transmission and hosting of
information).
For a general overview of the law on data
protection see Kuner, C. (2007). European
data protection law: Corporate regulation
and compliance. USA: Oxford University
Press.


The European Commission has a crucial role in terms of data protection in Europe. The
commission is the only institution within
the EU that proposes legislation (often in
the form of directives) and the commission
is also the institution that is responsible for
monitoring the implementation of the data
protection directives. Importantly, the commission also takes action against member
states when they fail to properly transpose
directives such as the Data Protection Directive 1995 into the domestic law of their
country.
This chapter is not going to consider the
differences between the United States and
Europe in their approach to data protection. It
is sufficient to say here that the U.S. and
the EU have adopted different approaches
to data protection. The United States has
adopted a type of sectoral approach to data
protection, insofar as a mixture of legislation and self regulation govern the area of
data protection. This approach contrasts
with the EU approach embodied in the Data
Protection Directive 95/46/EC, which is a
comprehensive piece of legislation aimed at
protecting privacy of personal data.
95/46/EC. This chapter is not going to
deal with the Directive on Data Retention,
EC/2002/58, which was introduced in response to concerns with terrorism. Many
commentators have been extremely critical of the directive, which was ultimately a
rushed piece of law making. In terms of its
contribution to data protection laws in the
European Union, the directive has served to
add more confusion. In particular, the 2002
Directive adds complexity to difficult issues
such as jurisdiction and the applicability of
the law.
This is a pan-European organisation with 47 different member countries; there are 5 observer countries, which include the Holy See, the United States, Canada, Japan, and Mexico. More information on the Council of Europe is available from the council's Web site at: www.coe.int.
This text of the European Convention
on Human Rights and Additional Protocols is available from the Council of
Europe's Web site. Retrieved October 13,
2007, from http://www.echr.coe.int/NR/
rdonlyres/D5CC24A7-DC13-4318-B4575C9014916D7A/0/EnglishAnglais.pdf
Article 10 of the ECHR provides for the
fundamental right to freedom of expression.
This right includes explicitly the freedom
to receive and impart information and ideas
without interference by public authority
and regardless of frontiers. This freedom
to receive information under Article 10 is
considered to imply freedom in seeking
information. Articles 8 and 10 of the convention complement each other in protecting the
privacy of Europeans but also their freedom
of expression.
For a general discussion on the European
Court of Human Rights and Article 8 see:
Mowbray, A. (2nd.ed., 2007). Cases and
materials on The European convention on
human rights, Oxford University Press.
White, R. C. A., & Ovey, C. (4th ed., 2006).
Jacobs and White: The European convention on human rights, Oxford, UK: Oxford
University Press.
The text of the convention is available from
the Council of Europes Web site. Retrieved
October 13, 2007, from http://conventions.
coe.int/Treaty/EN/Treaties/Html/108.htm
See the Web site of the Council of Europe. Retrieved October 13, 2007, from http://www.
coe.int/t/e/legal_affairs/legal_co-operation/
data_protection/background/1Background.
asp#TopOfPage
The conditions include:
• Where the data subject has unambiguously given his or her consent for the processing;
• Where the processing is necessary for the performance of a contract to which the data subject is party;
• Where the processing is necessary for compliance with a legal obligation;
• Where the processing is necessary in order to protect the vital interests of the data subject;
• Where the processing is necessary for the performance of a task carried out in the public interest; and
• Where the processing is necessary for the purposes of the legitimate interests pursued by the controller or by the third party or parties to whom the data are disclosed (with the exception of where such interests are overridden by the interests for fundamental rights and freedoms of the data subject).
Importantly, Article 14 of the directive specifically requires member states to take the
necessary measures in the domestic law of
their country to ensure that data subjects are
aware of the existence of the right. This is
a positive obligation on states that requires
them to make consumers aware of their
rights with regards to the direct marketing
provision.
It is important to note that exceptions are contained under Article 26 of the directive, where, for example, a data subject gives consent for the transfer.
Problems have arisen as different member
states have their own language and separate
legal tradition. This gives rise to problems in
transposing the Directive into the national
laws of the member states of the EU.
The combination of civil, administrative, and
criminal punishments and the differences in
the importance of administrative bodies and
private enforcement of the law, in particular
further this confusion and uncertainty.
Independent data protection authorities
now function in 27 member states of the
newly enlarged European Union. See for
example, the United Kingdom Information Commissioner's Office Web site at www.ico.gov.uk and the Irish Data Protection Commissioner's Web site at www.dataprotection.ie (Retrieved October 13, 2007).
In England and Wales summary convictions
are less serious offences and are triable in the
magistrates' courts. More serious offences
are triable only on indictment in the crown
court.
For example, information as to whether personal information will be sold to third parties
is contained in these privacy notices.



Chapter XVIII

Cybermedicine, Telemedicine,
and Data Protection in the
United States
Karin Mika
Cleveland State University, USA
Barbara J. Tyler
Cleveland State University, USA

ABSTRACT
This chapter provides an overview of law relating to online and Internet medical practice, data protection, and consumer information privacy. It provides a comprehensive overview of federal (HIPAA) and
state privacy laws, concluding that both those legal resources leave gaps in consumer protection and
provide no real penalties for violating the laws. The authors educate the readers to the legal and data
protection problems consumers will encounter in purchasing medical and health services on the Internet.
Furthermore, the authors recount some actual case studies and follow those with expert advice for those
Internet consumers who wish to be not merely informed, but also safe. The authors not only educate
the readers to the lack of protection afforded to them but also advocate throughout the chapter that the
United States must enact more federal protection for the consumer in order to deter privacy violations
and punish criminal, negligent, and wilful violations of personal consumer privacy.


INTRODUCTION
The practice of medicine is not immune from
the information age. The use of the Internet,
including e-mail, in medical practice is altering
the traditional method of delivering medical care.
Millions of Americans now rely upon the Internet
as a primary source of medical information or
education about their own symptoms, conditions, diagnoses, and treatments. The practice of
telemedicine, consulting with another physician
by using technology, is constantly evolving and
expanding into areas never before imagined.
Physicians are establishing their own Web sites
and some few are now practicing medicine on
the Internet.
The progression of the traditional practice
of medicine in cyberspace has brought with it
many issues related to privacy and online data
protection. No longer is the physician-patient
relationship limited to an in-person office consultation that carries with it the legal protections
of doctor-patient privilege. Rather, the practice
of medicine has evolved to include interactions
that might not have ordinarily been considered
a physician-patient relationship, and these
contacts may stretch across both real and virtual
boundaries. In fact, the interactions are, at times,
both real and virtual, and the consumer-patient is
now in a situation where it is difficult to identify
exactly who is the party on the other end.
This chapter will provide an overview of the
law relating to cybermedicine, medicine practiced without traditional in-person contact, and
telemedicine, in terms of data protection and
other legal complications related to licensing
and a conflict of state laws. The chapter will
examine the laws applicable to Web sites where
medical diagnosis or the purchase of medical
services (including prescriptions) is available. The
chapter will discuss how the new methodology of
acquiring medical care is at odds with traditional
notions of state regulation and how current laws,
both federal and state, leave many gaps related to any consumer protections or potential causes of action when privacy is compromised.

This chapter will proceed with an overview
of the federal Health Insurance Portability
and Accountability Act of 1996 (HIPAA), an
act promulgated to ensure privacy of health information as well as access to health care. It will
review HIPAA's application to medical practice
conducted on the Internet. It will, in brief, discuss
the plethora of sites available over which American
citizens may purchase prescription drugs without a prescription from a licensed United States
physician or merely through an overseas Web site
with no physician to monitor the transaction. We
then will examine current federal laws which are
not set up to regulate these international transactions. The chapter will explore potential legal
complications with personal data and privacy
issues related to purchasing medical treatment
or services on the Internet and describe what, if
any, legal recourse consumers might have when
the outcome of an Internet medical transaction
turns out to be undesirable. The chapter will posit
some expert advice for consumers regarding using
websites for medical purposes as well as protecting
their own privacy. Lastly, this chapter advocates
a federal law more punitive than HIPAA; one
that regulates and protects patient information,
medical transactions, and interactions on the
Internet and deters violations of patient privacy
by mandating significant fines and imprisonment
for negligent or criminal and willful violations
of that privacy.

THE TRADITIONAL PRACTICE OF MEDICINE IN THE UNITED STATES

Physicians' State Licensure
The study and practice of medicine is core to
mankind's need to extend and preserve life. The
evolution of medical practice over the centuries
from prehistoric times has involved the study of the human body and its mechanisms, disease processes, surgery, and natural herbal and manufactured drugs, to the present time, with its focus
on modern genetic testing and techniques. While
modern medicine and medical education had its
genesis in the 19th century, the 20th century has
made incomparable strides in clinical practice,
immunology, and pharmacology.
Practicing medicine is not regarded by the
law as an inherent right of an individual. Rather,
it is regarded as a privilege that is granted by the
people in a state acting through their elected representatives. Because it is considered a privilege
to practice medicine, each state protects health
care consumers by licensing and regulating
physicians. As a report to Congress stated, "The purpose of licensing health care professionals is to protect the public from incompetent or impaired practitioners" (Telemedical Report to Congress,
1996). Licensure authority defines who has the
legal responsibility to grant health professionals
the permission to practice their profession. Physicians, dentists, registered nurses, and pharmacists
are subject to mandatory licensing in all 50 states.
Many other types of health care professionals
are also licensed in most states. The federal
government licenses some individual health care
providers, for example, those professionals who
manufacture, distribute, prescribe, and dispense
controlled substances must be registered with
the Drug Enforcement Administration (Miller,
2006).
Historically, under the Tenth Amendment to
the United States Constitution, states have the authority to regulate activities that affect the health,
safety, and welfare of the citizens within their
borders, including the practice of medicine. The
United States Supreme Court has recognized that
states have a compelling interest in the practice
of professions within their boundaries (Goldfarb
v. Virginia State Bar, 1975).
In response to this amendment, each state
has created a Medical Practice Act that defines
the proper practice of medicine and mandates the duties of the state medical board to regulate medical practice. Every state and United States
territory has a medical board. The primary means
through which the state medical boards promote
sound medical practice and keep consumers safe is
through licensing and regulating of physicians.
The federal government plays little role in
setting standards for the medical field except to
the extent that the Food and Drug Administration
is responsible for determining what prescription
drugs are available, and setting safety standards
for drugs and packaging. Retail pharmacies are
highly regulated. All states require pharmacies
have a license. Some states regulate hospital
pharmacies through hospital licensing, exempting
those pharmacies from the pharmacy licensing
system. Pharmacy regulations often impose staffing requirements as well that limit the hiring of
those who are unlicensed. Standards of practice
can be found in many places such as state statutes, agency regulations, and county or municipal
ordinances.
Pharmacists are covered by state law as well. All states require licensing of
pharmacists. The National Association of Boards
of Pharmacy (NABP) is the independent group
that assists member boards and jurisdictions in
developing, implementing, and enforcing uniform
standards. This group provides an Electronic
Licensure Transfer Program (ELTP) listing all
50 states, the District of Columbia, Puerto Rico,
and the Virgin Islands and the requirements for
licensed pharmacists to transfer an existing license
from one state or jurisdiction to another, as well as
the fee for processing the preliminary application
(NABP Web site, 2007). The federal government,
under the Drug Enforcement Administration, licenses professionals who manufacture, prescribe,
dispense, or distribute controlled substances.

Board Certification
Traditional medical licensing has changed in
recent years to require more education for those physicians or specialists who wish to acquire board certification beyond the traditional state medical license. A sea change has occurred in the
medical sciences as a direct result of exponential advances in electronics, chemistry, physics,
computer sciences, engineering, and clinical
medicine. Medical practice today requires so
much knowledge that it is virtually impossible for
any physician to become an expert in every field.
Thus, many current clinical practices require
advanced, specifically focused knowledge and
extra years of training and education in specific
fields of study.
According to the American Board of Medical Specialties (ABMS), there are more than 130
medical specialties and subspecialties. There
are 24 member boards listed as members of the
ABMS. If a physician is board certified it means
that the physician has not merely completed medical school and holds a valid state license but has
also completed an appropriate residency from 3
to 7 years and been examined using the criteria
informing the physician's specific field.
As medicine and its practice become infinitely more complicated, the licensure system within the United States discourages interstate practice (Johnson, 2006). There are some exceptions to requiring an actual physician's state
license in each state: one such exception is the
consulting exception. Physicians may practice
medicine in another state by acting in consultation with a state-licensed referring physician.
Even this exception varies from state to state and
this exception preceded the advent of the practice
of telemedicine. To remedy this discouragement
of interstate practice, specialists who complete
residencies should be considered for federal
licensing. This licensing would allow the best
and brightest to practice within the United States
in all jurisdictions, regardless of state boundaries and not be hindered by local restrictions on
consulting and practice.

Establishing the Physician-Patient Relationship
Whether express or implied, the traditional physician-patient relationship is contractual in nature
and grounded in the premise that the physician
is a learned individual, skilled and experienced
in subjects about which the patient knows little or nothing, but which are of vital interest to the patient since they determine his health
(Miller, 2006; Kohlman, 2006). This relationship
is a fiduciary one, requiring the highest ethical
obligation from the treating physician. Generally,
the relationship may be created from an express
or implied agreement. It is an express agreement
when actual written forms are signed, the patient
agreeing to pay for services in exchange for the
performance of a specific service from the physician. In most cases, however, the agreement
is implied. The patient finds a physician or is
referred, makes an appointment, and travels to
the physician's office. When the physician accepts the patient or undertakes to treat him, and
the patient accepts the services, the relationship
is created. Generally, a physician or other independent practitioner has the right to accept or
decline to establish a professional relationship
with any person (Miller, 2006).
Even though the existence of a physician-patient relationship usually depends upon whether
a physician has examined, diagnosed, and treated
a patient, the relationship must first be consensual
for the purposes of doctor-patient privity. The
relationship is considered consensual when the
patient knowingly seeks the services of the physician and the physician knowingly accepts treatment of the patient (Kohlman, 2006). Certainly,
such a relationship exists when a patient makes
an appointment with and sees a practitioner in
her office. In addition, physicians and surgeons
on a hospital staff enter into a physician-patient
relationship with every patient that they treat in
the hospital, whether the patient has been admitted for emergency treatment or is even conscious or able to consent. Once the relationship is established, it is a fiduciary relationship in which mutual trust and confidence are absolutely essential.
The practitioner incurs a duty of due care to the
patient that is always measured by a professional
standard for rendering professional services, one
that is usually monitored by the state in which the
physician is licensed (Johnson, 2006).

Informed Consent
In general, consent for most treatment must be an
informed consent. This type of consent means
that the treating provider is required to give the
patient or decision maker several elements of
information before the decision on treatment is
made. As the American Medical Association
states so eloquently on its Web site: "Informed consent is more than simply getting a patient to sign a written consent form. It is a process of communication between a patient and physician that results in the patient's authorization or agreement to undergo a specific medical intervention" (AMA, 2007). To establish a cause of action based
upon lack of informed consent, the patient must
prove that a practitioner failed to disclose to the
patient the various alternatives and the reasonably
foreseeable risks and benefits involved which a
reasonable medical practitioner under similar circumstances would have disclosed (AMA, 2007).
The ethical obligation to communicate certain
information to the patient exists in statutes and
case law in all 50 states.
In 2007, Medicare and Medicaid circulated
new interpretive guidelines contained in the Code
of Federal Regulations that significantly expanded the scope and documentation of informed
consent that must be obtained by hospitals prior
to performing surgical procedures. For example,
the new Medicare/Medicaid guidelines require
that patients be informed if a practitioner, other
than the primary surgeon, would perform important parts of the procedure, even when the person is performing under the supervision of the primary surgeon. Additionally, where surgery
is concerned, the consent form must specify the
specific significant surgical tasks that would
be conducted by surgeons other than the primary
surgeon (42 C.F.R. 482.51). This requirement
gives more transparency to the long-held ability
of an experienced surgeon to allow a resident who
is learning to gain experience in doing surgical
procedures by requiring the patient be informed
of that fact in advance.

Liability: Battery and Negligence


Because of the way in which the traditional
medical relationship is established, the liability
imposed for breaches of duty can be contractual in
nature or based on tort law. Tort liability is civil in
nature, not criminal, and is imposed by the common law and some statutes for injuries caused by
breaches of duty not based on the contract. Tort
liability is almost always based on fault, whether
the fault be intentional, reckless, or negligent.
The most frequent type of liability for health care
professionals and institutions is the negligent tort.
The five elements required to establish negligence
or malpractice are the following: (1) a duty, that
means what should have been done; (2) a breach
of duty, or a deviation from the required standards
of care; (3) injury to a party; (4) causation, that
means an injury directly and legally caused by
the deviation from what should have been done.
To determine negligence, it is first important to be informed regarding the duty required of the practitioner in the jurisdiction.
Once the existence of a duty is established, the
scope of the duty must be determined. This scope
is often referred to as the standard of care. The
standard of care for individual health care professionals is what a reasonably prudent health care
professional engaged in a similar practice would
have done under the circumstances (Miller, 2006).
Thus, the standard of care may differ for one who
is an internist, a pediatrician, an obstetrician, or a thoracic surgeon based upon each physician's special education and training. Board certified
and trained specialists would owe a higher duty
of care because of their advanced education,
training and skill.
When there is no informed consent or authorization for a procedure, the physician or other
practitioner can be liable for battery even if the
procedure is properly performed, beneficial, and
has no negative consequences for the patient. The
touching of the patient alone leads to the liability
(Fox v. Smith, 1992; Bommardy v. Superior Court,
1990). Early court cases provided that giving
the patient incorrect or incomplete information
about a procedure or treatment could invalidate
the consent and make the practitioner liable for a
battery (Bang v. Miller, 1958; Moser v. Stallings,
1986). The majority of jurisdictions now rule that
failure to disclose or to fully inform the patient is
a separate cause of action from a battery.

Jurisdiction, Venue, and Conflict of Laws
In the practice of traditional medicine, there has
been little question as to where a particular cause
of action might be brought against a medical practitioner or when that action might be brought. The
traditional patient/physician privilege has existed
on a personal level with a patient physically seeing a physician. That physician may not practice
medicine unless licensed in that particular state.
Thus, were there to be a cause of action brought
against the physician, the action would be brought
in the state where the relationship existed, and the
laws of that state regarding proving a cause of
action would be applied. The state would necessarily provide a statute of limitations for bringing
a malpractice or other legal action against the
physician.
There would be exceptions, however, if the
individual seeking medical attention was not from
a particular state but sought expert medical help or particularized treatment. Nonetheless, in a traditional physician-patient relationship in which each party is physically present during the
treatment phase, the theory of where the lawsuit
is brought and what laws are applied remains essentially the same. If a patient seeks treatment in
a state in which she is not domiciled, that patient
is still seeking treatment from a physician who
must adhere to the licensing requirements and
standards of the second state. Thus, it would be
those laws that would apply should the patient
be injured in the course of medical treatment or
determine to bring a cause of action at a later
date. No federal laws would apply, except to the
extent that a patient might possibly be bringing
an action based on an inappropriate action taken
by the FDA related to a controlled substance or
other type of drug (McGrath, 2005).

THE INTEGRATION OF FEDERAL LAW INTO THE TRADITIONAL PRACTICE OF MEDICINE: HIPAA
The landscape of state domination of the medical
profession changed somewhat with the enactment,
in 1996, of the federal Health Insurance Portability and Accountability Act commonly referred
to as HIPAA. The enactment of HIPAA has
also had implications concerning the burgeoning
business of providing medical advice and treatment electronically.
HIPAA was originally enacted to provide
for the portability of health care coverage for
workers who had lost jobs or were changing jobs
(Metz, 2004). Ideally, HIPAA sought to implement a more unified system of medical information storage such that medical information could
be easily transmitted electronically (Chiang &
Starren, 2002).
Because HIPAA contemplated the use of a unified electronic storage system, there was a demand
that there be provisions enacted that would ensure
the privacy of electronically transmitted material.
Thus, included within HIPAA are protections covering the privacy of an individual's medical records. The privacy provisions of HIPAA
are intended to allow a patient to limit who will
have access to medical records and further provides a limitation on the internal use of sharing
information for purposes of diagnosis in that it
restricts the disclosure of health information to
the minimum amount necessary required for
the intended purpose (Schmidt, 2000).
HIPAA specifically covers health information
oral or recorded in any form or medium that:
a) is created or received by a health care provider,
health plan, public health authority, employer,
life insurer, school or university, or health care
clearing house; and
b) relates to the past, present, or future, physical
or mental health condition of an individual; the
provision of health care towards an individual;
or the past, present, or future payment for the
provision of health care to an individual. (Health
Insurance Portability and Accountability Act of
1996)
HIPAA essentially applies to three specific
health entities: health care providers (such as doctors and hospitals), health care plans, and health
care clearing houses, which include such entities
as third party billing services that may be hired to
code certain medical procedures for insurance
companies (Public Welfare. 45 C.F.R. 160.102).
HIPAA also applies to employers, insurance
companies, and public agencies that deliver social
security or welfare benefits to the extent that they
work with or necessarily disseminate information related to an employee's medical records (§§ 160, 164). HIPAA applies to all individual health
information that is maintained or transmitted
and includes health claims, health plan eligibility, enrollment and disenrollment, payments for
care and health plan premiums, claim status, first
injury reports, coordination of benefits, and related
transactions (Metz, 2004). Thus, the purpose of HIPAA, to provide health care portability for the patient while protecting privacy concerns,
regulations affecting the practitioner, the health
care institution, and the insuring entity.
In addition to HIPAA, federal privacy law
also has been enacted to protect the confidentiality of information concerning patients who are
referred or treated for alcoholism and drug abuse
(42 U.S.C. 290dd-2). The rules apply to any
specialized programs to treat substance abuse in
any facility that receives federal funds, including
Medicare and Medicaid.

State Law and HIPAA


The United States Constitution provides that
federal law is "the supreme law of the land."
Ordinarily, that clause would be interpreted to
mean that when the federal legislature has passed
a law on a particular issue, that federal law would
take precedence over state laws on similar issues.
Indeed, the enactment of HIPAA mandated that
all state-licensed medical entities and professionals be HIPAA compliant within a certain period
of time regardless of any state laws on the issue
(Privacy Rights Clearing House, 2003).
However, the Supreme Court has determined
that when there are both federal laws and state
laws covering the same issues, federal laws will
not supersede a state law if the state law provides
more protections than the federal law. It is said
that a federal law may provide a floor for the
rights of an individual while a state may provide a
ceiling (Kramer, 2007). As applied to HIPAA,
an individual is certain to enjoy the privacy protections set out in the federal statute, but may enjoy
more privacy protections if his/her home state
has enacted additional privacy requirements for
medical practitioners. Examples of state laws that
provide greater privacy protection than HIPAA
are those state laws regulating the release of a
patient's mental health information and HIV/AIDS
test results (Miller, 2006). In cases in which the



Cybermedicine, Telemedicine, and Data Protection in the United States

state protections are greater, the state law trumps


any federal law.

Civil Liability for Violations of HIPAA


Currently, HIPAA has no civil liability provisions for any violations of the statute. Thus, an
individual who has been harmed in some way
by a practitioner or entity that fails to protect
them has no legal recourse for damages under
the federal law. The law does provide criminal
consequences in the form of fines and jail time.
HIPAA provides that a person who knowingly uses or causes to be used a unique health
identifier [e.g., names, addresses, social security
numbers], . . .obtains individually identifiable
health information relating to an individual, or
discloses individually identifiable health information to another person may:
1. Be fined not more than $50,000, imprisoned not more than 1 year, or both;
2. If the offense is committed under false pretenses, be fined not more than $100,000, imprisoned not more than 5 years, or both; and
3. If the offense is committed with intent to sell, transfer, or use individually identifiable health information for commercial advantage, personal gain, or malicious harm, be fined not more than $250,000, imprisoned not more than 10 years, or both. (45 C.F.R. 160.102)

HIPAA appears to specifically limit enforcement: actions may be brought only by the respective states or the Secretary of Health and Human Services (42 U.S.C. 300gg-22(a); O'Donnell v. Blue Cross Blue Shield of Wyo., 2001). Thus,
reporting a HIPAA violation might be the best
that an individual harmed by the release of data
can do. However, some plaintiffs have attempted
to bring indirect causes of action for a HIPAA
violation by bringing a common law invasion of privacy action against the offending party. In fact, HIPAA does not always pre-empt state litigation
practices (Beck & Hermann, 2007). Moreover,
some states specifically allow causes of action for
the dissemination of private medical information
(Pettus v. Cole, 1996).

TELEMEDICINE, CYBERMEDICINE, AND INFORMATIONAL WEB SITES

The Basics
When determining the complications related to
medical practice and privacy on the Internet, it is
important to understand the distinction between
the types of medical Web sites that one might
encounter, and the medical interactions that
one might have. Like the traditional in-person
patient/physician relationship, some virtual
interactions between patients and physicians are
between known individuals who might actually
see each other, although far removed from
each other geographically (Harrington, 1999). In
other interactions, however, the person logging
into a website might have no idea who is on the
other end, or even if the person on the other end
is a physician. Thus, these relationships make
the definitions of patient and physician hazy
when attempting to discern the application of any
existing privacy laws to Internet or telemedical
transactions (Lewis, 2004).

Telemedicine
Telemedicine is defined as the use of telecommunications technology to provide health care
services to patients who are distant from a physician or other health care provider (Granade &
Sanders, 1996). Generally, a consultant is used
to medically diagnose a patients condition via
two-way interactive television, remote sensing
equipment, and computers. Such medical practice
has advantages, not only for improving access

Cybermedicine, Telemedicine, and Data Protection in the United States

to medical specialties, but for primary care for


patients and cost reduction (Johnson, 2006). For
example, in rural Arkansas, a pregnant patient
with a family history of spina bifida, a congenital
abnormality of the spine, living in a rural area
whose obstetrician has just performed a prenatal
ultrasound, can have her test transmitted to a
specialist in maternal-fetal medicine in a large
medical center who can review the test and opine
the fetus is developing normally and immediately
reassure the patient (Johnson, 2006). The benefits
of using technology of the rich to diagnose and
treat the poor have substantially increased the
opportunities of those who would not otherwise
receive health care and allowed them to receive
specialized treatment (Tyler, 1998). Moreover,
the availability of telemedicine has enabled the
sharing of both information and expertise which
has enhanced the development of the medical
profession in general.

Cybermedicine
A. Patient Treatment
An offspring of the information and technology
revolution, cybermedicine is the embodiment of
a discipline that applies the Internet to medicine.
The field uses global networking technologies to
educate, innovate and communicate in ways that
enhance and promote medical practice, rapidly
transforming medicine into a different discipline.
In many respects, cybermedicine is broadly
defined as the practice of medicine without the
necessity of any physical in-person consultation
or examination (Scott, 2001).
Cybermedicine is a relatively new phenomenon that has been around for less than 15 years.
On Friday, October 4, 1996, the news announced
the first virtual real live doctor's office on the
World-Wide-Web had opened (Cyberdocs Today,
1996). On this site, for a reasonable fee of $65, a
patient could enter her name and vital statistics, her
medical problem, send the service her symptoms, provide credit card information, and within a few minutes, without ever being examined, receive treatment and possibly a prescription. This first site,
located in Massachusetts, required that the patients
either be physically in the state of Massachusetts
or traveling abroad in a foreign country. These
two location safeguards allowed the site creators
to circumvent the problem of state licensing laws
for liability purposes (Tyler, 1998). If the patient
was traveling in a foreign country, the patient
who needed a medication refill could have one
sent by fax machine to a local overseas pharmacy
within a few hours. The reach of cybermedicine
has since extended beyond the boundaries of a
home state.

B. Prescription Drug Purchasing


An offshoot of the cybermedicine industry is
a new generation of Web sites that provide prescription drug delivery to consumers. Estimates
of the number of Internet pharmacies operating
in the United States have reached as high as 1400
(Gregg, 2007). One article estimates that, in one year, the number of Web sites offering controlled substances such as Xanax, Vicodin, and Ritalin grew by 135%.
Investigators found 394 Web sites advertising
controlled prescription drugs and 47% of the sites,
or about 187, offered to sell the controlled drugs.
The vast majority of those Web sites offered to
sell the drugs without a prescription (DeNoon,
2007).
While most Internet pharmacies are legitimate businesses that offer customers an attractive
and convenient option to purchase affordable
medicine in accordance with state and federal
law, these new online pharmacies present many
problems for drug companies, consumers, and
physicians. Consumers find it very attractive to
order from the privacy of their own homes and
find that such pharmacies offer privacy and convenience for them based upon medical consultation
with a physician.




In some instances, the privacy aspects making many sites attractive are problematic in and of
themselves. Individuals can often order drugs by
a brief description of symptoms, symptoms that
they may or may not have (FDA FAQs, 2007).
In addition, because the Internet guarantees near
anonymity, there is no way for the Web site to
tell that the person ordering on line is the person
actually on the other end of the transaction. Although this type of scenario raises the specter of
fraud, it raises other issues related to privacy.
The truth is that anyone who owns a credit card
and a computer can order controlled substances
or any other drugs online. Pharmacies that do
not require a faxed or mailed prescription from
a licensed physician present serious concerns for
not only privacy but life itself. Drugs offering help
for erectile dysfunction, Viagra, Cialis, and Levitra, rank among the top 10 drugs bought online.
Other drugs in the top 10 sought after online are
Propecia, for baldness, as well as drugs for acid
reflux, cholesterol, and bone density (Lade, 2007).
Anonymity seems to be the deciding factor in ordering such drugs online. Legitimate pharmacies
functioning online can be problematic for privacy
concerns. An FDA advisory was issued in spring
2007 after examining foreign drug purchases
and finding drugs with potentially dangerous
side effects were easily ordered online, without
a prescription. Two deadly examples of resulting
problems with online ordering of drugs follow.

C. A Suicide Case

In what is believed to be the first lawsuit of its kind, the parents of John McKay have sued an Internet site, a Colorado physician, and a pharmacy in Mississippi in federal court in California for wrongful death and negligence in the suicide of their son (Ostrov, 2007). John McKay was a Stanford University freshman and debating champion who used a credit card to order 90 tablets of the anti-depressant Prozac (generically called fluoxetine) from an online pharmacy. The pharmacy McKay used does not require a faxed or mailed prescription from a licensed physician. Instead, the prescription site asks only that the patients fill out an online questionnaire about their health history. McKay noted on his application that he had moderate depression, had taken the drug before, and was not suicidal. The doctor who wrote the prescription had a restricted medical license and was not allowed to prescribe drugs.
Less than 7 weeks after ordering Prozac, John McKay killed himself. He was 19 years old. Articles reveal that neither the web prescribing doctor nor the online pharmacy accepted any responsibility for McKay's death (Ostrov, 2007; Lade, 2007). California law requires that prescriptions must be written by a licensed California physician after a physical examination of the patient (Ostrov). Prozac and similar antidepressants carry the FDA's strongest warning that notes the link between taking the drug and suicide in children and adolescents. The warning urges physicians to closely monitor young patients on these drugs, especially at the start of treatments.

Expert Advice in Light of McKay

• Parents must monitor the Internet use of children and young adults.
• Parents should check the family mail for suspicious packages.
• Parents should keep credit cards and any documents with credit card numbers away from children.
• Parents should make certain young people who are treated for depression are consistently monitored by a physician.

D. An Overdose Case
Christopher Smith made over $24 million
selling prescription painkillers illegally through
his Internet pharmacy before he was indicted
(Unze, 2007). Minneapolis court documents
show that Smith sold prescription painkillers to Justin Pearson at least 12 times in 2004-05. Justin accidentally overdosed on Christmas Day, 2006.
Two other individuals listed in the government's
filing bought drugs from Smith and committed
suicide or died of medical complications from
prolonged controlled substance abuse within
18 months of purchasing drugs on the Web site.
Smith was convicted of conspiracy to sell drugs
illegally, misbranding drugs, and money laundering. He was sentenced to 30 years in federal
prison on August 1, 2007 (Unze). Pearson's family
members petitioned the state legislature to change
the law on Internet pharmacies. Minnesota law
now prohibits doctors from writing prescriptions
for Minnesota clients based solely upon an online
questionnaire.

Expert Advice for Finding Safe Internet Pharmacies Online

• Use a pharmacy in the United States, not Canada or Europe.
• A safe pharmacy has a phone number and address listed.
• A safe pharmacy is one that is licensed by a government agency or certified by a reliable organization. (The National Board of Pharmacy has a Web site verifying U.S. pharmacies have met state and federal regulations. See www.vipps.info).
• A safe pharmacy has a clear privacy policy. Read it. Make certain it is easy to understand and explicitly states it will not share your personal information with others unless you agree.
• Always have a prescription from a physician or health care professional who has examined you.
• Make certain a licensed pharmacist is available to answer any questions you have.
• Price comparisons can be found at Pharmacy Checker, a company that certifies online pharmacies. See www.pharmacychecker.com.

INFORMATIONAL AND SELF-HELP SITES
A third category of electronically available medical Web sites comprises those that are purely for information, or that provide forums for discussing medical
problems. Web sites proliferate that are available
for educating the public about everything imaginable in health care from routine procedures to
rare diseases. Because of the availability of these
sites, patients can now read excellent articles and
research dealing with a rare disease that they, or a
family member, may suffer from. There are many
stories regarding individuals who diagnosed their
own or others illnesses by using the Internet.
Hospitals as prestigious as the Mayo and
Cleveland Clinic also routinely set up Web sites
that not only provide information about the hospital, but provide information on disease processes
and give health care consumers preoperative and
postoperative instructions. In addition, these
Web sites often provide links to articles about
particular diseases, or even have informational
charts concerning drug interactions. Sites such as
WebMD allow individuals to look up symptoms
or ask medical questions of certain practitioners
in an effort to get feedback on a particular ailment. Finally, listservs and chatrooms provide
individuals the opportunity to share information
about particular medical ailments or seek advice
or support from others dealing with the same
problems. As an example, ACOR.org allows those afflicted with various cancers to share information with an international contingent about their conditions.

PRIVACY CONCERNS RELATED TO ELECTRONIC SITES IN GENERAL
Since the inception of the Internet and the ability
to enter personal information into a Web site, there
have been problems with ensuring that personal
data remains secure. Today, there are burgeoning problems with credit card fraud and identity theft in the best of circumstances. Connecting to a
virtual site in which information must travel along
millions of shared lines compounds the problem,
even in situations where an individual believes
she is entering into a transaction onto a trusted
site. Even if a company takes the opportunity to
employ encryption methodology (such as Verisign), the opportunity for data stealing remains in
situations where hackers lurk. It is impossible to
say with certainty that any electronic transaction
will remain completely private (Mencik, 1999).
Unfortunately or fortunately, depending upon
personal point of view, there is no overarching
regulation of the Internet. The Internet is not a
place, nor is it contained within a certain jurisdictional boundary. Those setting up Web sites
do not have to sign up with a central authority or
conform to certain rules and regulations. There is
no central place to register domain names and no
periodic reporting that must be done to maintain
a Web site.
Some regulation of business on the Internet
has begun. There are some laws that govern some
types of Internet business and/or transactions;
however, most of these laws draw upon principles
that are more appropriately applied to land-based
businesses and not cyberspace entities. For instance, the Federal Trade Commission is charged
with enforcing various consumer protection laws
such as preventing false advertising and deceptive
trade practices (Granfield, 2000). Both state and
federal law enforcement agencies are charged with
preventing fraud occurring over the Internet, and
there are various statutes, both state and federal,
that deal with preventing the dissemination of
spam (Controlling the Assault of Non-Solicited
Pornography and Marketing Act of 2003).
Nonetheless, no law now in existence is able
to prevent the stealing of personal data, and
few laws can protect individuals whose personal
data is used unknowingly by a Web site for uses
such as targeted advertising. The only way for
individuals to protect themselves to the highest degree possible is to be aware of the privacy policies of the Web site with which they are dealing.
Financial data should be logged only into trusted
Web sites that have encryption methodology in
place, and personal data (such as names and addresses) should be shared only on websites that
have similar encryption in place or on those that have privacy policies with which the individual
agrees. Most consumers should be aware that the
Internet makes it very easy to share data with
other businesses, and that most businesses would
prefer to engage in quick, targeted advertising.
Any registration with any type of Web site may
make the consumer part of a larger database that
will not only result in unwanted email solicitations,
but will make the consumer more susceptible to
scam advertising.
Consumers should be aware that Web sites,
even those related directly to the medical profession, have different types of privacy policies.
Some highlights from these privacy policies are
included in Table 1.
When accessing any Web site where personal
information is disclosed, consumers who have
any privacy concerns at all should be aware of
the privacy policies related to the Websites. As
the examples indicate, not all information that
might be thought to be private or even protected
by HIPAA is necessarily private. For instance,
there are many pharmaceutical Web sites that
are not United States Web sites and not subject
to United States laws. Although many of these
websites have American sounding names, the
consumer should be aware that when logging
in, entering data, and ordering, that information
entered might not be as private as the consumer
thought.
In addition, many Web sites contain somewhat
of a caveat emptor proviso indicating to the
consumer that if there are any links accessed from
the first website, the consumer should understand
that the privacy policies of the first Web site do
not apply. Further, various Web sites disclose that
private data may be accessed by a company doing

Cybermedicine, Telemedicine, and Data Protection in the United States

Table 1.
WebMD

There is a lengthy privacy policy that includes information about what cookies are
collected. The policy also provides that personally identifiable information will not be
disclosed except 1) to meet legal requirements and 2) when there is a threat requiring
disclosure. The site informs the consumer that the consumer will be informed of
material changes to the policy and provides that complaints may be lodged with
the TRUSTe privacy watchdog.

American Academy of
Family Physicians (AAFP)

A shorter privacy policy discloses use of cookies and states that member information
may be provided to constituent chapters. The policy also provides that some
information may be disclosed for purposes of targeted sales. There is a disclaimer
providing that some information may be disclosed when legally required. The site also
forewarns the consumer that it cannot be held responsible for the actions of third
parties that have links within the site.

Merck.com

Merck provides that consumers may elect a level of privacy protection. The policy states that, "Personal information about you will be accessible to Merck, including its subsidiaries, divisions, and groups worldwide, and to individuals and organizations that use personal information solely for and at the direction of Merck," and further provides that information will be disclosed only to those working on its behalf.

American Heart Association

Policy gives consumer ability to opt out of disclosure, but also provides that
aggregate information is sometimes disclosed for research purposes. There may be
disclosure as required by law.

Revolutionhealth

Policy provides that the information provided by the consumer may be used to acquire
information about other people in your demographic area for targeted advertising. The
policy states that information may be disclosed for legal reasons, or when a threat is
involved (e.g., national security). The site has a disclaimer that if third party sites are
accessed, the privacy policies of the third party sites should be reviewed. Consumers
have an opportunity to opt out of particular disclosures.

MedRx-One
(No prescription necessary site)

Non U.S. company; one line privacy policy: medrx-one pledges that the information
you enter will not be shared with any parties not directly involved with the ordering or
delivery process without your expressed consent (except for fraud cases) and that any
e-mails you receive from medrx-one will be related to your order. The terms of use
indicate that local laws (i.e., country of origin) will apply to any legal issues.

CVS

Provides state specific privacy policies in addition to an extensive privacy policies


that mirrors the HIPAA regulations. Provides that information may be disclosed to
business associates provided an appropriate contract exists that safeguards privacy.
Sets out that information may be disclosed in some instances to an individual (friend
or family member) involved in your care, if we can reasonably infer that you agree.

Walmart

Walmarts privacy policy provides, We may use or disclose your PHI for prescription
refill reminders, to tell you about health-related products or services, or to recommend
possible treatment alternatives that may be of interest to you, and, We may
disclose your PHI to a family member or friend who is involved in your medical
care or payment for your care, provided you agree to this disclosure, or we give you
an opportunity to object to the disclosure. If you are unavailable or are unable to
object, we will use our best judgment to decide whether this disclosure is in your best
interests.

Further, various Web sites disclose that private
data may be accessed by a company doing business
with the first party and that the company
is required by federal law to enter into contracts
ensuring the privacy of data. However, most Web
sites are vague about what data is protected and
do not provide the consumer access to the actual
contract or to a listing of the data being disclosed
and how it will be protected.

Finally, various Web sites provide for the
disclosure of data in certain instances, such
as, in numerous pharmacy sites, disclosure to a
friend or family member who is the caretaker of
the individual seeking a prescription. Although
this proviso may make it more convenient for a
caretaker to access information about medications for an
individual unable to access the data herself, there
is also a potential for abuse. In many situations, an
individual has sought out the Internet for purposes
of enhanced privacy for the very reason that she
does not want family members to be aware of certain health information. Without any verification
procedures in place, many individuals may have
their medical privacy compromised even though
a website is legitimate, well-respected, and has
various privacy protection procedures in place
that a consumer believes are absolute.
Although there is sometimes recourse when
an alleged breach of privacy occurs by way of an
Internet transaction, that recourse may be limited
in scope. As previously discussed, various federal
entities are responsible for ensuring the enforcement
of some laws; however, the Internet provides
opportunities for businesses to both form and
disappear overnight, making it impossible not only
to find the target of an investigation, but also to
determine the appropriate jurisdiction to handle
any complaint. Furthermore, although various
actions might be brought against Web site
companies that do business in the United States
and technically exist in the United States, there
is virtually no recourse available against Web site
companies that exist outside the boundaries of
the United States. These sites may not have even
minimal privacy protections for consumers.
Certainly, the speed with which Web sites may
be formed and disbanded will serve to hinder any
sanctions or penalties geared toward
the enforcement of privacy rights.
Finally, the varying privacy policies in place on
Web sites make it difficult for consumers to even
know what their rights are, or what rights were
violated. Although most legitimate companies
dealing with medical data have privacy safeguards
in place, there are various loopholes that consumers must be aware of whenever transacting
business on the Internet. These include releases for
targeted marketing in some instances, disclosure
to third party companies with which the original
company does business, disclosure to caretakers,
and disclosures for law enforcement purposes
(and when threats are involved).

CONCERNS RELATED TO THE DISCLOSURE OF MEDICAL INFORMATION

A. Patient Rights and State Remedies

Medical information is often the most private and
intimate information about an individual. Health
information privacy is based upon two principles:
1) the patient should be entitled to know what is
contained in her own record; and 2) the patient
should be able to control who has access to that
information. States, individually, have long been
the principal regulators of health information.
While physicians have always been obliged under
their ethical obligations and the Hippocratic Oath
to protect health care information, other secondary
users in the health chain, such as insurers, the
government, and employers, have not always had
an obligation to keep information confidential
when storing or transmitting it. Only recently has
the federal government promulgated, through
HIPAA, the Federal Health Privacy Rule to even
out and establish a floor of privacy protection for
all citizens.
While all states have constitutions that may
give rise to a right of privacy, explicit guarantees
of privacy in state constitutions are rare. Applying
privacy protections to health information has occurred in piecemeal fashion with little consistency
from entity to entity and state to state. Thus,
the right of privacy in state law provisions is a
patchwork varying from legislative protection to
only common law provisions.
While every state has a statute restricting the
use and disclosure of medical information, few
states have taken a broad or uniform approach.
Rather, the protection afforded to the information
tends either to be specific to a certain condition or
fails to cover much of the information collected.
Most states have some type of statutory provider-patient
privilege that affords the consumer limited
protection of one's health information. But
states vary widely in the scope of the provisions
they have enacted restricting the use of medical
information (Pritts, 2002).
Most states allow health care providers to use
and disclose patient identifying information for
treatment, payment, peer review, and research
purposes. For any use or disclosure of information not specified by state statute, the patient's
written permission is required. Patients should
be informed regarding the information-divulging practices of their health care providers. They
are entitled to receive and review their medical
records. In addition, the security of their medical
information should be protected from unauthorized use or disclosure.
To be effective, the state privacy statutes
must provide remedies and penalties for violating
them. Thus, if personal health care records are
not provided to a patient, there must be a sanction for failure to do so. In addition, if health care
providers can divulge information with impunity,
there is no real benefit to consumers.
States run the gamut in the remedies their
statutory provisions offer to protect the health consumer. Some states have expressly granted rights
to patients to bring suit for equitable relief and
attorney fees. Other states hold a violator liable
for actual and punitive damages (Pritts, 2002). If
a person can show the privacy violation was made
with knowledge or was made intentionally, many
states allow for criminal penalties including fines
and imprisonment.

B. Need for Remedies & Uniformity


The problem remains that there is no uniform
comprehensive state approach to violations of
health care consumer privacy. Most states have
allowed only some elements of fair and uniform
disclosure into their codes. Many have no remedy
for violations. California has crafted some of the
most privacy-oriented consumer protection laws
in the nation (Cal. Civ. Code 56-56.37). California's
code affords patients rights, by law, with respect to
most of the major holders of health information.
And the code restricts disclosures by health care
providers, HMOs, and even employers. The California
code also gives patients the right to sue to enforce
their rights. Yet, even California's law is lacking
because there is no provision to require notice of
the health care provider's practices and policies to
patients. Individual states have a long way to go
to offer real solutions to the problem of privacy.

FEDERAL PRIVACY PROTECTIONS

Applicability of HIPAA to Cybermedicine and Telemedicine

With the recent issuance of federal government
regulations governing the use and disclosure of
health information by the Department of Health
and Human Services, the role and importance of
state government remedies have changed. While
HIPAA is the first federal health privacy law, it
does not pre-empt stronger state laws. Thus, state
law can offer greater protection than HIPAA
provides, especially with respect to remedies for abuse
(Pritts, 2002).
Unlike many laws that do not contemplate
the incorporation of technology into everyday
life, HIPAA was originally enacted for the specific
purpose of integrating the use of computer
technology in the dissemination of information.
Thus, after the enactment of HIPAA, traditional
forms of communication were added to its coverage
so that there would be no loopholes regarding
the dissemination of information through such
avenues as oral and written communication. As
a result, the application of HIPAA appears to be
broad-based, covering traditional forms as well as
both cybermedicine and telemedicine,
at least in large measure (Wachter, 2001).


For example, HIPAA's Privacy Rule is
broad-based in its application and protects individualized medical information transmitted in
any form or medium. It extends to all patients
whose information is collected. The Rule imposes
responsibilities on all employees and volunteers
and requires health care institutions to receive
contractual assurances that business associates
that will be handling individualized medical information will keep that data private (pp. 1-4).
HIPAA also mandates that those providing
health care furnish a Notice of Privacy Practices
that sets out how the organization will protect
health-related information and specifies a
patient's rights, including:

•	How to gain access to the patient's health records.
•	How to request corrections to errors in a patient's health records (or voice a disagreement with a medical conclusion).
•	How to determine where medical information has been disclosed.
•	How to request additional confidentiality for particular health information.
•	How to keep information confidential by providing alternate contact numbers such that information will not be given to members of a household.
•	How to complain about a breach of privacy protocol.
•	How to contact the U.S. Department of Health and Human Services to follow up about a complaint regarding a breach of privacy protocol (University of Miami School of Medicine, 2005).

Additionally, the Privacy Rule requires supplemental authorization if a health care provider intends to disseminate any private information for the purpose of research, some types of marketing, or fundraising. Information related to psychotherapy is also considered overly sensitive, and HIPAA requires supplemental authorization for the dissemination of this type of information.

There are, however, some types of medical information that are not considered completely private. This information may be disclosed in limited circumstances without there being a HIPAA release. These categories include:

•	Information related to public health issues.
•	Information related to abuse or domestic violence situations.
•	Information related to law enforcement, or some judicial proceedings.
•	Information that might prevent an imminent threat to public safety.
•	Specialized information as required by particular state or federal statutes.

In addition, not all forms of communication of medical information fall into the category of marketing. For instance, it is not considered marketing for a covered entity to use personal information to tailor a health-related communication to that individual, when the communication is:

•	Part of a provider's treatment of the patient and for the purpose of furthering that treatment, such as the recommendation of a particular medication.
•	Made in the course of managing the treatment, such as reminders of appointments (Privacy Rights Clearinghouse).

In many instances, individual state laws will
govern what must be disclosed regardless of the
fact that HIPAA is a federal statute. And one
author has argued that states should not rely
solely on the Federal Health Privacy Rule to
protect citizens, but that they should "at a bare
minimum . . . mirror the federal protections,
thereby allowing enforcement to occur at the state
level" (Pritts, 2002). Thus, states are encouraged
to enact legislation spelling out health consumers'
rights and remedies. In addition, those states
that have already enacted comprehensive health
privacy rules should reevaluate their statutes so
they fill in gaps that may exist. For example, use
of health information for marketing purposes is
often ignored in state laws, while some states
have enacted more stringent standards than the
federal law (Boyer, 2004).
Whereas HIPAA's Privacy Rule protects
against the disclosure of certain information,
HIPAA's Security Rule sets out regulations
related to the electronic transmission of data
and thus imposes security requirements on the
electronic systems themselves. Moreover, the
Security Rule imposes these restrictions on the covered
entities of health care providers, health plans,
and health care clearinghouses. Neither in-person
nor hand-written communications are covered by
the Security Rule, but they do fall under the Privacy
Rule, which covers any mode of communication
(University of Miami Medical School; 45 C.F.R.
164.302, 45 C.F.R. 164.306).
Given that telemedicine stems from a traditional doctor-patient relationship in which consultations are theoretically done remotely, technically, the criteria related to HIPAA would apply
to those practicing telemedicine. However, there
would be some complications related to HIPAA
and privacy in general depending on what type
of telemedicine relationship is established.
In a situation where a patient sees her own
physician and there is a consultation that is done
while her physician is in the room, there is the
opportunity for the original physician to ensure
that the consultant is bound, in writing, by the
prescriptions of HIPAA and that the patient is
informed of her rights under HIPAA should there
be such a consultation. If, however, a remote consultation is done without the involvement of an initial primary care physician,
there is less of an opportunity to verify privacy
rights, especially if there is some immediacy to
the teleconference.
Although it is currently uncommon for an
individual to log on to a Web site and visually
present an injury for diagnosis, it is likely
that this type of doctor/patient interaction will
increase in the future. With such an immediate
relationship, there might be little opportunity for
the patient to be told of the Web site's privacy
practices, and there would likely be no opportunity
to review and sign appropriate forms.
Thus, although any type of telemedicine practitioner would be bound by HIPAA, the patients
would be well-advised to know the policies of the
particular practitioner before going ahead with
a consultation. In the case where a primary care
physician would be involved in a telemedicine
conference, the patient would be well-advised to
know what privacy policies have been established
between the two entities.
The practice of cybermedicine is similar to
telemedicine and, for the most part, those who
practice cybermedicine are bound by HIPAA as
are those involved with telemedicine. However,
cybermedicine may present more opportunities for compromising privacy because there are
various forms of what might be considered the
practice of cybermedicine that are not subject to
the HIPAA regulations (Lewis, 2004).
Cybermedicine is often associated with prescription drugs, and, true enough, pharmacists
and pharmacies are subject to the regulations of
HIPAA as are the parent companies that might
be engaged in the pharmaceutical business (such
as Rite Aid or CVS). However, cybermedicine
is also associated with such Web sites as "Ask a
Physician," "Self-Help," or "General Advice" Web
sites. Even in the case of asking a physician,
one might not establish the doctor/patient relationship necessary to bind the doctor in question to
HIPAA as a health care provider.
Although many individuals might ordinarily
believe that information is confidential because
there is a person's medical condition being discussed, that is not the case (Nath, 2006). Many
people routinely log into Web sites to find out
information concerning symptoms, or request
information about a particular drug. Some sites
require information such that a response might
go to an email address, or may even require information such as an address. Because these sites
are not health care providers, there is no prohibition that would prevent continuous e-mailings or
postal mailings that a person may not want, concerning
a condition that the person may not want others
to know about. Because there is no health care
provider relationship, there is nothing prohibiting any of these sites from selling information
to other entities.
There are numerous privacy concerns that
may develop by virtue of the fact that lines blur
as the electronic trail becomes more extensive. In
an ordinary situation, a health care provider deals
with secretaries who may deal with insurance
companies, who may deal with other insurance
companies, who may deal with outsourcing of
billing and/or data processing, who may ultimately
deal with collection agencies for unpaid bills.
Although each of these entities would technically
be bound to keep data private under an initial
business entity and employee agreement, the more
people involved, the more difficult it is to control
who has what data.
Where telemedicine and cybermedicine are
concerned, the trail goes out even farther. In telemedicine situations, there may be staff in a room
unseen by a patient having nothing to do with the
procedure or medical consultation (such as janitors,
computer technicians, or camera operators). For
both telemedicine and cybermedicine, there are
Web designers and engineers who may have access to information in databases within the Web
site. None of these ancillary workers are generally
bound to keep health information private.
When the Web site itself is set up to work as
a place where health care is provided, there is a
good argument that these technical employees
are bound by HIPAA as employees dealing with
health treatment information; however, that characterization is debatable if dealing with a third
party Web administrator (i.e., an independent
contractor), or when the Web administrators are
part of a larger conglomerate, such as when a
retailer (e.g., Walmart) has a site where prescriptions may be filled. There is also some uncertainty
as to the status of a third-party owner of
a Web site when that owner is not in the health
care field but rather has purchased the Web site
for investment purposes, or even when the Web
site itself is a subsidiary of a health care company
(Nath, 2006).
In all instances involving use of the Internet for
medical care, the consumer must not assume that
whatever personal data is being handled will be
kept confidential. If the consumer is attempting to
protect private data, that consumer must be aware
of the policies of any entity with which any
type of medical information is shared.

Problems with Privacy Liability


While HIPAA pre-empts any state law that provides less protection than it does, state laws that
provide greater or equal protections than HIPAA
remain in effect. Certainly, HIPAA privacy rules
allow providers to share private information as
long as it is used for allowable purposes such as
billing and treatment. Most states have placed significantly more limitations on the scope of courts
to order release of personal health information.
Several areas of personal medical information,
including mental health and HIV/AIDS cases
also are protected by state statutes that restrict
disclosure of such information.
There is a common law cause of action that
consumers may have based upon the physician-patient
relationship and privacy (Miller, 2006).
Two examples of such actions include:

1.	A New York court legally enjoined a psychoanalyst from circulating a book in which detailed information concerning a patient was written (Doe v. Roe, 1977).
2.	The Oregon Supreme Court ruled that a physician was liable for revealing his patient's identity to the patient's natural child who had been adopted (Humphers v. First Interstate Bank, 1985). This breach was held to be a breach of professional responsibility.

Thus, state law, both statutorily and at common
law, may offer greater privacy protection than
HIPAA's privacy rules do.

PRIVACY CONCERNS FOR THE FUTURE

Any prediction about the future of medicine and
healthcare necessarily includes the increased utilization
of computer information and the Internet.
Any such discussion must also contemplate the
exponential increase in the use of national and
international markets, with more chances for
individual privacy to be compromised. There is
an increasing focus
on computerized medical records. The Veterans
Administration has adopted computerized record
keeping (Schmidt v. Dept. of Veterans Affairs,
2003). An executive order in 2004 established
the Office of Health Information Technology.
Thus, the federal government is moving forward
in its efforts to use technology in health record
keeping.
The advantages of such record-keeping
systems are better patient care information and
fewer iatrogenic illnesses, drug interactions, and
prescribing mistakes. The disadvantage is the
security risk. Each day, the newspaper contains
stories of computer records being stolen or compromised. The HIPAA privacy rules do pertain
to all computerized records and e-mail. Records
and communications of this sort must be protected
with encryption and other security steps. Computerized records have been accidentally posted
on the Internet (Pillar, 2001).
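
To illustrate the kind of encryption safeguard described above, the following is a minimal Python sketch, assuming the third-party cryptography package; the record contents and key handling are hypothetical, and a real system would rely on proper key management rather than an in-memory key.

    # Minimal sketch: symmetric encryption of a health record before storage or
    # transmission. Assumes the third-party "cryptography" package; the record
    # text below is hypothetical.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()        # in practice, held in a key-management system
    cipher = Fernet(key)

    patient_record = b"Jane Doe; DOB 1970-01-01; Rx: example medication"
    ciphertext = cipher.encrypt(patient_record)   # safe to store or e-mail

    # Only a holder of the key can recover the plaintext.
    assert cipher.decrypt(ciphertext) == patient_record
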
E-mail communications should be handled
with the same discretion as other methods
of communication. E-mails can be
requested in discovery and may be used in civil
and criminal actions: e-mails tend to exist in
cyberspace forever. Stories abound regarding health
care information e-mails that have accidentally
been sent to hundreds of the wrong recipients
(Salganik, 2000). The law relating to computers,
e-mails, and the Internet is still developing.
Individuals deserve privacy in their dealings
with physicians, health maintenance organizations, billing agencies, pharmacies, and other
entities. The remedies, both civil and criminal,
offered to consumers for breaches of privacy
are piecemeal at this time and consist largely of
common law actions or those statutorily provided
by each state.
The federal HIPAA law articulates the
minimum protection that may be afforded to
individuals. There is no uniform consistent
statutory scheme to protect individual privacy
of medical information nationally. Each state is
free to create its own legislation. It is important
to minimize the accidental release of private
information. The law is continually developing
in the area of privacy, the Internet, e-mails, and
computer-stored information. New legal issues
emerge each day that need attention in this ever-expanding
area. Federal law must be enacted to
deal with Internet privacy violations uniformly
and provide deterrents for violating the law, with
significant fines and criminal penalties for the full
range of patient privacy violations, from the merely
negligent to the purposeful criminal violation of
patients' privacy. Web sites that provide drugs
to all comers with no physician oversight should
be illegal.

REFERENCES
42 C.F.R. 482.51
42 U.S.C. 290dd-2 (2006).
American Board of Medical Specialties. (2007).
Retrieved June 1, 2007 from http://www.abms.
org



American Medical Association. (2007). Informed
consent. Retrieved June 1, 2007, from http://www.
ama-assn.org/ama/pub/category/4608.html
Bang v. Miller, 88 N.W. 186 (Minn. 1958). Retrieved June 1, 2007, from LexisNexis, Cases,
Law School database.
Beck, J., & Herrmann, M. (2007). Drug and
device law: HIPAA does not preempt state litigation practice. Retrieved July 2, 2007, from http://
druganddevicelaw.blogspot.com/2007/02/hipaadoes-not-preempt-state-litigation.html
Bogozza, D. (2006). WHO and partners accelerate
fight against counterfeit medicines. PR Newswire
US. Retrieved June 5, 2007, from LexisNexis
News

Cyberdocs today announced the first virtual
doctors [sic] office on the world wide web (1996).
Westlaw M2 Presswire. Retrieved June 1, 2007
from 1996 WL 11276562
Denoon, D. J. (2007). Internet drug pushing up
again. More websites advertising, selling controlled prescription drugs. Health News. Retrieved
June 21, 2007, from WebMD, http:// www.webmd.
com/news/200770518/interent-drug-pushing-upagain
Doe v. Roe, 93 Misc. 2d 201, 400 N.Y.S.2d 668
(Sup. Ct. 1977).
Food and Drug Administration, FAQs. Retrieved
September 26, 2007, from http://www.fda.gov/oc/
buyonline/faqs.html

Bommareddy v. Superior Court, 272 Cal. Rptr. 246 (Cal. Ct. App. 1990). Retrieved from
LexisNexis Cases, Law School database.

Fox v. Smith, 594 So. 2d 596 (Miss. 1992). Retrieved June 1, 2007, from LexisNexis Cases, Law
School database.

Boyer, M. C. (2004). Texas administrative agencies tackle compliance with the health insurance
portability and accountability acts privacy rule.
Texas Tech Journal of Administrative Law, 5,
100-111.

Goldfarb v. Virginia State Bar, 421 U.S. 773,
792 (1975).

Cal. Civ. Code 56-56.37 (LexisNexis 2007).

Granfield, A. (2000). FTC's bite out of internet
fraud lacks teeth. Forbes.com. Retrieved July 2,
2007, from http://www.forbes.com/2000/03/27/
mu2.html

Can Google diagnose illness better than doctors? (Nov. 6, 2006). The Daily Mail (London).
Retrieved June 4, 2007, from http://www.seroundtable.com/archives /006667.html
Chamness, J. N. Liability of physicians for communicating over the internet. Retrieved June 1,
2007, from http://www.Cornelius-collins.com/
CM/Whats New/asp
Chiang, M., & Starren, J. (2002). Telemedicine
and HIPAA: Data confidentiality and HIPAA.
Retrieved May 23, 2007, from http://www.ideatel.
org/syllabus/hipaa.html
Controlling the Assault of Non-Solicited Pornography and Marketing Act of 2003, 15 U.S.C.
7706.



Granade, P. F., & Sanders, J. H. (1996). Implementing telemedicine nationwide: Analyzing the
legal issues. 63 Defense Counsel, 67.

Gregg, J. (2007). Senator Gregg introduces bill
to increase safety for Americans buying on-line
prescription drugs. States News Service. Retrieved
June 5, 2007, from LexisNexis News library.
Harrington, K. (1999). Legal implications of the
practice of medicine over the internet, telemedicine and cybermedicine. Cyberlaw. Retrieved
July 2, 2007, from http://www.gase.com/cyberlaw/toppage11.htm
Health Insurance Portability and Accountability
Act of 1996, 42 U.S.C. 1320(d).

Humphers v. First Interstate Bank, 696 P.2d 527
(Or. 1985).
Iredale, W., & Swinford, S. (2007, February 25).
Web supplies prescription drug addicts. Sunday
Times (London). Retrieved June 3, 2007, from
LexisNexis News file.
Johnson, A. (June 2006). Digital doctors. National Conference of State Legislatures Magazine.
Retrieved June 1, 2007, from http://www.ncsl.
org/programs/pubs/slmag/2006/org
Kohlman, R. J. (2006). Existence of physician
patient relationship. 46 Am. Jur., Proof of Facts
2d 373. 1, 2, 3, 8.
Kramer, D. (2007). American jurisprudence: Supremacy of constitutions; supreme laws (2nd ed.).
Minneapolis: The West Publishing Company.
Lade, D (2007, July 15). Getting medication in
privacy is part of internet appeal, but there are
risks. Sun Sentinel. Retrieved September 17, 2007,
from LexisNexis current News file.
Lewis, C. A. (2004). Cybermedicine: Do you
need to know or see your physician, or is it safe
to practice medicine in cyberspace? Retrieved
March 28, 2007, from http://gsulaw.gsu.edu/lawand/papers/fa04/lewis/doc/htm
Lisse, J. (2007). Bechet disease. Retrieved June
3, 2007, from http://www. emedicine.com/med.
topic218.htm
Mack, A. (2000, January 3). Op-Ed Re: Sometimes
the patient knows best. New York Times. Retrieved
June 5, 2007, from LexisNexis News service.
McGrath, R. (2005). Only a matter of time: Lessons unlearned at the food and drug administration keep Americans at risk. Food and Drug Law
Journal, 60, 603-624.
MedicineNet. (2007). Orphan disease definition.
Retrieved June 4, 2007, from http://www.medterms.com/script/main/art.asp?articlekey=11418

Mencik, S. (1999). Are secure internet transactions really secure? Retrieved June 6, 2007, from
http://www.jsweb.net/paper.htm
Metz, J. (2004). Practical insights to HIPAA:
Overview and general background regarding
HIPAA. Retrieved May 21, 2007, from http://www.
dilworthlaw.com/pdf/hipaa.pdf
Miller, R. D. (2006). Health care law. Sudbury,
Massachusetts: Jones Bartlett Publishers.
Moser v. Stallings, 387 N.W.2d 599 (Iowa 1986).
Retrieved June 1, 2007 from LexisNexis law
School database.
Nath, S. W. (2006). Relief for the e-patient?
Legislative and judicial remedies to fill HIPAA's
privacy gaps. George Washington Law Review,
74, 532-540.
O'Donnell v. Blue Cross Blue Shield of Wyo., 173
F. Supp. 2d 1176, 1179-80 (D. Wyo. 2001).
Office of Health Information Technology (2004).
Retrieved June 21, 2001 from http://www.Whitehouse.gov/omb/egov/gtob/health_informatics.
htm
Ohio Rev. Code Ann. 2743.43 (A)-(D) (LexisNexis 2007) Expert testimony on liability issues.
Retrieved June 1, 2007, from LexisNexis Ohio
Cases.
Ostrov, B. (2007, March 14). Menlo Park teen's
suicide shines light on shadowy market. The
Mercury News. Retrieved September 18, 2007,
from LexisNexis News library.
Patients' Rights Conditions of Participation
(Revised 2007). 42 C.F.R. 482.13, 482.24 and
482.51.
Pettus v. Cole, 57 Cal. Rptr. 2d 46 (Cal. Ct. App.
1996).
Pillar, C. (2001, November 7). Web mishap:
Kids' psychological files posted. L.A. Times,
Nov., A1.



Pritts, J. L. (2002). Altered states: State health
privacy laws and the impact of the federal health
privacy rule. Yale Journal of Health Policy, Law
& Ethics, 2, 334-340.

Tyler, B. J. (1998). Cyberdoctors: The virtual
housecall: The actual practice of medicine on the
internet is here; is it a telemedical accident waiting
to happen? Indiana Law Review, 13, 259-290.

Privacy Rights Clearinghouse. (2006). HIPAA
basics: Medical privacy in the electronic age.
Retrieved May 23, 2007, from http://www.privacyrights.org/fs/fs8a-hipaa.htm

University of Miami Miller School of Medicine
Privacy/Data Protection Project. (2005). Privacy
Standard/Rule (HIPAA). Retrieved May 22, 2007,
from http://privacy.med.miami.edu/glossary/
xd_privacy_stds_applicability.htm

Public Welfare. 45 C.F.R. 160 et seq.


The Safe Internet Pharmacy Act of 2007, S.596,
110th Congress, 1st Session (Fall 2007). Retrieved June 5, 2007, from LexisNexis Congressional database.
Salganik, M. W. (2000, August 10). Health data on
858 patients mistakenly emailed to others; medical information was among messages sent out by
Kaiser Health Care. Baltimore Sun, Aug., 1C.
Schmidt, C. W. (2000). Patient health information goes electronic. Retrieved July 2, 2007, from
http://pubs.acs.org/subscribe/journals/mdd/v03/
i09/html/rules.html
Schmidt v. Dept of Veterans Affairs, 218 F.R.D.
619 (E.D. Wis. 2003).
Scott, R. (2001). Health care symposium: E-health: The medical frontier: Cybermedicine and
virtual pharmacies. West Virginia Law Review,
103, 412-413.
Shah, R., & Piper, M. H. (2007). Wilson disease.
Retrieved June 4, 2007, from eMedicine-Http://
www.emedicne.com/med/topic2413.htm
Symptoms of Cushing Syndrome. (2007). WrongDiagnosis.com. Retrieved June 3, 2007, from
http://www.wrongdiagnosis.com/c/cushings_syndrome/symptoms.htm
Telemedical Report to Congress, 104th Cong., 2d Session (1996).

Unze, D. (2007, August 3). Internet operation
puts man in prison. St. Cloud Times. Retrieved
September 17, 2007, from LexisNexis Current
news file.
Wachter, G. W. (2001). Law and policy in telemedicine: HIPAA's privacy rule summarized: What
does it mean for telemedicine? Retrieved May 30,
2007, from http://tie.telemed.org/articles/article.
asp?path=legal&article=hipaaSummary_gw_
tie01.xml

SUGGESTED READINGS
Beaver, K., & Herold, R. (2003). Practical guide
to HIPAA privacy and security compliance. New
York: CRC Press.
Hall, G. (2006). Privacy crisis: Identity theft
prevention plan and guide to anonymous living.
United States: James Clark King, LLC.
HIPAA: Health Insurance Portability and Accountability Act of 1996, 42 U.S.C. 1320(d).
Sanbar, S. S., Fiscina, S., & Firestone, M. H.
(2007). Legal medicine (6th ed.). Philadelphia:
Elsevier Health Sciences.
Slack, W. V., & Nader, R. (2001). Cybermedicine:
How computing empowers doctors and patients
for better health care. United States: Jossey-Bass
Inc.

WEB SITES
Data Privacy Lab. (2007). The laboratory at
Carnegie Mellon seeks balanced integrated solutions that weave technology and policy together.
Retrieved November 4, 2007, from http://www.cs.cmu.edu

Privacy Rights Clearinghouse. (2007). HIPAA
basics: Medical privacy in the electronic age.
Retrieved November 5, 2007, from http://www.
Privacyrights.org/inquiryform.html


Chapter XIX

Online Privacy Protection in Japan:
The Current Status and Practices
J. Michael Tarn
Western Michigan University, USA
Naoki Hamamoto
Western Michigan University, USA

ABSTRACT
This chapter explores the current status and practices of online privacy protection in Japan. Since the
concept of privacy in Japan is different from that in western countries, the background of online privacy
concepts and control mechanisms is discussed. The chapter then introduces Japan's Act on the Protection of Personal Information along with the privacy protection system in Japan. Following the discussion
of the privacy law, Japan's privacy protection mechanisms to support and implement the new act are
examined. To help companies make smooth adjustments and transitions, a four-stage privacy protection
solution model is presented. Further, this chapter discusses two case studies to exemplify the problems
and dilemmas encountered by two Japanese enterprises. The cases are analyzed and their implications
are discussed. The chapter is concluded with future trends and research directions.


INTRODUCTION
In the past, privacy protection was not considered
necessary for business in Japan. Instead, the
market determined how companies were to deal
with consumers' private information. However,
information technology (IT) has advanced rapidly
and business standards changed to the use
of electronic files. Companies began to store
tremendous amounts of information in databases
rather than in paper-based file cabinets.
IT has changed business structures, but it has
also exacerbated privacy problems: private data
leaks, unauthorized data collection, and the loss
of private data.
After more and more privacy-related problems
were revealed by the media, consumers began
to pay attention to the protection of their private
information. As a result, the Japanese government established the Act on the Protection of
Personal Information in 2005 to protect consumers and regulate companies' business activities
associated with customers' private information
(Yamazaki, 2005). After this law was launched,
many companies exposed weaknesses in
their privacy protection systems and unethical
private data use. The role of privacy had begun
to shift to the consumer side. When consumers
decided to purchase or do business transactions
online, they assumed that there would be a reliable system and trustworthy privacy protection
(Tahara & Yokohari, 2005).
The organization of this chapter is as follows.
In the next section, the background of online
privacy concepts and control mechanisms is
discussed. The chapter
then explores Japan's Act on the Protection of
Personal Information along with the privacy
protection system in Japan. Following the discussion of the privacy law, Japan's privacy protection mechanisms to support and implement the
new act are discussed. To help companies make
smooth adjustments and transitions, the authors
present a four-stage privacy protection solution

model. Further, this chapter discusses two case
studies to exemplify the problems and dilemmas
encountered by two Japanese enterprises. The
cases are analyzed and their implications are
discussed. The chapter is concluded with future
trends and research directions.

BACKGROUND
The concept of privacy in Japan is different from
that in western countries. Japan is a country with a
very high density of population. People are living
right next to each other, and it seems like there
are no boundaries and there is no privacy. However, these are the characteristics of the Japanese
people who indeed understand and respect privacy
(Makoto et al., 2005). Even though there is only a
thin wall between rooms, people can have privacy
with "as if" behavior. For example, even though
one person knows another's secret, he or she will
act as if they do not know the secret (Mizutani,
Dorsey, & Moor, 2004). This describes how Japanese
people respect each other's boundaries and help
keep secrets. It is also important to understand that
the Japanese culture is group-based. Within the
group, people respect each other and try to think
and move in the same direction. Although they
have their own minds and thoughts, they always
consider how other people in the group think first,
and then decide what to do, heavily depending on
the group's opinions. Often people do not use eye
or body contact as frequently as western people
do because of the different perception of privacy
they have in mind (Makoto et al., 2005).
However, the Internet has created a new environment for privacy. People can obtain, access,
and manage enormous amounts of information
without actual face-to-face interaction. People
can express their opinions anonymously and
they can act any way they like on the Internet.
Anonymity has a major impact on the Japanese
conception of privacy because people no longer
have to depend on the group mind, since on the
Internet information can be easily reached and
individual thinking is promoted and encouraged.
In other words, before the Internet was introduced,
the group-based society had bound most people's
behavior. However, after the Internet became
everyone's platform, limitations from the group
society were diminishing, which, on the other
hand, advanced the concept of privacy protection
in Japan (Mizutani et al., 2004).
Since information technology has been growing
rapidly and companies can manage and
collect massive amounts of private information,
this traditional privacy protection has not been
working as strongly as it once did. According to
the Ecom Journal (Japanese), the online market
in Japan was $100 million in business-to-business
(B2B) and $13 million in business-to-consumer
(B2C) in 2005, almost 20% growth as
compared to 2004 (Goto, 2006). As a result of this
rapid growth, consumers gave up private information that was then used by business companies
without their permission.
Private information that was collected in
an illegal way had been shared, sold online,
used in unethical ways, and leaked outside of
the company. There were many lawsuits from
consumers, which gradually drew the media's and
consumers' attention to privacy. Unethical
business practices had occurred more often because
there had not been any related privacy laws
confining commercial companies' practices
until 2005. Consumers were educated to select
reliable companies, with awareness of their
privacy and security, when they released their
private information to companies (Tahara &
Yokohari, 2005).

Definition of Online Privacy in Japan


Private information is defined in Japan as any
information by which a living individual can be
identified, such as name, address, birth date,
and phone number. If information can connect a
person with another source of information, such
as an employee ID within an employee database,
it is also considered private information. The
purpose of Japan's Act on the Protection of Personal
Information (2005) is to protect the rights
and interests of individuals to control the collection
and usage of private information.

Privacy Control Mechanisms


Platform for Privacy Preferences (P3P)
Platform for Privacy Preferences (P3P) is a technology, built into Web browsers such as IE, that
helps consumers when they visit a company's
Web site. P3P was developed by the World Wide
Web Consortium (W3C). P3P automatically
matches up a company's privacy policy on the
Web site with the consumer's privacy preferences
and lets consumers know whether the policies
match (Analysis of Japanese privacy protection
technology, 2005).
Consumers can set up their preferences by
choosing a privacy level, which decides how much
private information they want to share with others.
The P3P protocol is built on HTTP to communicate
between consumers' Web browsers and e-commerce Web sites. Once P3P is set by the consumer,
it begins to match up all Web sites that he or she
visits and returns the results automatically.
Privacy policies are changing all the time and
since most of them are very long and complicated,
consumers need to take more time to understand
them. The advantage of using P3P technology
is that consumers do not have to keep updating
themselves to understand a privacy policy. In
other words, no matter how privacy policies are
changing, P3P will give consumers the results of
matchup and consumers can choose whether they
would accept or not. Not only can companies post
their privacy policies on their Web sites, but they
can also enable P3P functions that consumers can
easily recognize. It is important to attain trust from
consumers in order to establish a good relationship.
P3P can help companies distinguish themselves
as trustworthy companies (Cranor, 2001).
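
As a rough illustration of the matching step described above, the following Python sketch compares the data-use purposes a site declares against those a consumer is willing to accept. The purpose names and the simple subset rule are illustrative assumptions only, not the actual P3P vocabulary or any particular browser's implementation.

    # Illustrative sketch of a P3P-style match: warn the consumer when the site's
    # declared purposes exceed the purposes the consumer accepts. Purpose names
    # here are hypothetical, not the official P3P vocabulary.
    def policy_matches(site_purposes: set, accepted_purposes: set) -> bool:
        """True if every purpose the site declares is one the consumer accepts."""
        return site_purposes <= accepted_purposes

    user_preferences = {"current-transaction", "site-administration"}
    site_policy = {"current-transaction", "site-administration", "telemarketing"}

    if not policy_matches(site_policy, user_preferences):
        print("Warning: the site's declared uses exceed the consumer's preferences.")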

Private Information for Third-Party Use

In order to prevent companies from letting a third
party use private information without any permission from the individuals involved, two policies
have been introduced to take care of private
information from consumers for third-party use:
opt-in and opt-out. In response, many laws and
guidelines in many countries have applied these
policies (Bouckaert et al., 2006). The basic standard
of these policies is that a company must show its
privacy policy on its Web site. However, consumers'
private information is taken care of by different
processes depending on which one of them the
company selects.

Opt-In

As opposed to opt-out, opt-in policy requires a
company to obtain a consumer's explicit consent
for third-party use before consumers input their
private information. The company must post
its privacy policy and give consumers the
opportunity to choose options, such as whether
they agree with the company's privacy policy
and third-party use and whether they want to
get direct advertising e-mails from the company
(Bouckaert et al., 2006).
The opt-in policy puts more weight on the
consumer's side, instead of the company's. It
creates more limitations than the opt-out policy
because consumers can choose before they send
private information to the company. However,
it can yield more consumer satisfaction for the
company.

Opt-Out
In opt-out, a company must guarantee the right of
individuals to opt their private information out of
third-party use. It requires companies to post their
privacy policy, including the opt-out provisions, on
their Web site so consumers can access and understand
it before they actually send their private
information to the company. Companies also have to
provide the opportunity for consumers to request
that their private information not be used for third-party
use whenever they want to do so (Tahara
& Yokohari, 2005).
Once a company collects private information
according to the opt-out policy, it can use private
information for third-party use until consumers
ask it to stop. It is considered an agreement for
third-party use between consumers and the company
when consumers send their information to the
company. Under the opt-out policy, a company is
willing to share private information with a third
party, for such uses as direct phone marketing and direct
e-mail advertisement. Most consumers are not
aware of this policy, and in most cases companies
can take advantage of consumers to make their
business easier.
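
The practical difference between the two policies comes down to the default value of the consumer's consent at the time information is collected. The following Python sketch is a hypothetical illustration; the field names are not drawn from any statute or system.

    # Hypothetical illustration: the default consent value is what separates
    # opt-in from opt-out.
    from dataclasses import dataclass

    @dataclass
    class Registration:
        name: str
        email: str
        third_party_sharing: bool  # consent to share data with third parties

    # Opt-in: sharing is off unless the consumer explicitly turns it on.
    opt_in_signup = Registration("Taro Yamada", "taro@example.com", third_party_sharing=False)

    # Opt-out: sharing is on until the consumer asks the company to stop.
    opt_out_signup = Registration("Taro Yamada", "taro@example.com", third_party_sharing=True)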

Online Privacy Seals


The third-party assurance (TPA) seal program
provided by a third party helps consumers distinguish
Web sites that are certified from those that are not
yet certified. The third party assesses Web sites that
want to get certification from the seal program;
if a site qualifies, the company can get certified and
put the trust mark on its Web site. Companies
can thereby show consumers that they have strong
privacy and security policies and practices. Several
third-party assurance seal programs are available.
TrustE and BBBonline in the U.S. are the two
major seal program organizations in privacy
assurance (Kimery & McCord, 2006).
TrustE was established in 1997 as an independent
non-profit organization to protect
consumers' privacy rights and promote a safer
Internet marketplace. Its main purpose is to build
consumer confidence in e-commerce. To accomplish
its goal, TrustE uses the seal program,
a certification program for Internet consumer
privacy. If a company wants to get a certificate
from TrustE, it needs to apply and be assessed
by the organization. There are a few requirements
that must be met in order to get compliance, which
are 1) a participating merchant must prominently
display a privacy policy statement on its Web site,
2) consumers must be allowed to choose whether
or not their personal information is subject to
secondary use by the merchant, 3) consumers
must have access to information collected about
them, and 4) reasonable security measures must
be employed to maintain the privacy of personal
information (Benassi, 1999). Once the company
obtains a certificate, it can put the TrustE trust
mark on its Web site. The company can show its
reliability related to privacy policies and practices
to consumers using this certificate.

ONLINE PRIVACY PROTECTION IN JAPAN

The development of Japan's Act on the Protection
of Personal Information (hereafter the Act) began
in 1980. The Organization for Economic Cooperation
and Development (OECD) first released
guidelines for privacy in 1980. In response,
legislation for the protection of private information
possessed by the government was established in
Japan. In 1995, the EU established the EU Privacy
Protection Act. This law limited the transfer of
private data from the EU to other countries if
the other country had no privacy law as good as
the EU's. During that period of time, there was
no law enforcement or protection for the personal
information held by commercial companies in
Japan, and the Japanese government basically
relied on the market to decide how to manage
private information. After the EU privacy act,
the Japanese government began to shift from
market control to law control for private
information protection (Okamura, 2004).

The Act
Until 2005, there was no law enforcement for
privacy protection in Japan but only guidelines.


Japanese companies did not pay attention to
privacy protection because consumers could not
see how their private information was handled.
As a result, there have been many privacy violations, especially in the case of personal private
information leaking. For example, in 2003, there
were 560,000 reported incidents of leaked private
information according to an investigation by
Lawson, one of the biggest convenience store
companies in Japan. According to the report,
Lawson was outsourcing the maintenance of its
customer database to another company. The private information was leaked from the outsourced
company, while, ironically, Lawson did not even
have an access account to its own customer database. Lawson responded to the incident poorly,
sending a gift card of merely five dollars to all
customers whose information had been leaked.
This example was just one of many violation
cases in which the private information of consumers
was poorly protected.

Contents of the Act


In 2005, the Japanese government enforced the
Act on the Protection of Personal Information,
which consists of six sections. Its purpose is to
protect the right of all individuals to know how
their private information is taken care of. It is
also meant to protect their right to request that
companies not release their private information
(Tahara & Yokohari, 2005). The six sections of
the Act include: (a) general provision, (b) responsibilities of the state and local public bodies, (c)
measures for the protection of personal information, (d) duties of entities handling personal
information, (e) miscellaneous provisions, and
(f) penal provisions.
The first three sections are for government
and local public bodies. The fourth section targets
commercial companies. There are three regulatory codes and five obligations in this section.
This law prohibits companies from selling, transferring,
or using any private information without receiving
permission from the involved individual.


When the information is collected, the company
has to inform its consumers of how their private
information will be used. The company is also
responsible for protecting and managing the collected private information under a secure and safe
environment. If a data leak happens, it will be the
company's fault.
The three regulatory codes are listed (Tahara
& Yokohari, 2005):

•	A company is not allowed to use private information beyond its original intended purpose.
•	A company is not allowed to obtain private information via illegal methods.
•	A company is not allowed to give private information to a third party without receiving permission from the individual to whom the information belongs.

Moreover, the five obligations are detailed as
follows (Tahara & Yokohari, 2005):

•	A company must specify its reason for using private information.
•	A company must inform consumers of its purpose in using private information when the company collects their information.
•	A company must protect and manage private information under a secure and safe environment.
•	A company must manage and observe all employees and its relevant outsourced companies that use private information.
•	A company must release and correct private information, and stop using it, when consumers want to do so.

Scope of the Act


The Act defines the scope of its application based
on the amount of the company's private data
(more than 5,000 records), the method for storing
private information (an organized database), and
the period that is observed (the past 6 months).
According to the Act, private information is
counted per person, not per segment defined by
the company. For example, even though a company
has three segments of private information for one
person (name, address, and driver's license number),
it is counted as one. The Act covers not only
consumers' personal information but also any other
private information, such as employees' and
potential new employees' private information, and
other related private information held by each
employee, such as customers' and co-workers'
information on employees' laptop computers.
Organized private data means that the company
has a database to manage private information. The
database created by the company makes searching
and retrieving consumer private data easier and
more efficient. The 6-month period in the Act means
that a company has held more than 5,000 consumers'
private records more than once at any point during
the past 6 months. Even if a company has fewer
than 5,000 records right now, it might be in the
scope of the Act if its amount of private data
exceeded the defined 5,000 records during the
past 6 months.
In short, if a company stored more than 5,000
private records on its database more than once
during the past 6 months, it would be regulated
by the Act.
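
Stated as a simple rule, the scope test can be summarized in the hypothetical Python sketch below; the monthly record counts are illustrative, and the statute itself governs how the threshold is actually measured.

    # Hypothetical sketch of the scope test described above: a company is covered
    # if its organized database held more than 5,000 personal records at some
    # point during the past six months.
    THRESHOLD = 5000

    def within_scope(monthly_record_counts):
        """monthly_record_counts: records held in each of the past six months."""
        return any(count > THRESHOLD for count in monthly_record_counts[-6:])

    print(within_scope([4200, 4800, 5100, 4700, 4300, 4100]))  # True: exceeded once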

Classification of Private Information


Once a company is in the scope of the Act, it must
consider each type of private information it holds.
There are three types of private information defined
in the Act, each of which requires different obligations
from the company (Arai, 2005). The three types
include personal private information, personal
private data, and retained personal private data.

•	Personal private information: This type of information is about the living individual, such as name, address, birth date, and any other information that can be used to identify the individual. However, this information is not organized and not in a database.
•	Personal private data: If personal private information has been organized and put in a database to allow fast and easy search, this information is categorized as personal private data. The assumption is that personal private data will be deleted in 6 months and the company does not have any authority to release and correct the data. For example, if a company is contracted to create a database for another company as a part of its outsourced task, then the database will be deleted in 6 months and the company does not have the right to change any data after that.
•	Retained personal private data: If personal private data will be kept for more than 6 months and the company has the authority to release, correct, stop usage, and delete data, then the data is categorized as retained personal private data.
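
As an informal illustration of how the three categories above differ, the following Python sketch classifies a record by whether it is organized in a database, how long it is kept, and whether the company has authority over it. The field names and the decision rule are assumptions for illustration only, not the statutory test.

    # Hypothetical sketch of the three-way classification described above.
    from dataclasses import dataclass

    @dataclass
    class Record:
        organized_in_database: bool   # stored in a searchable, organized database
        kept_over_six_months: bool    # retained beyond the 6-month period
        company_has_authority: bool   # may release, correct, stop use of, delete

    def classify(record: Record) -> str:
        if not record.organized_in_database:
            return "personal private information"
        if record.kept_over_six_months and record.company_has_authority:
            return "retained personal private data"
        return "personal private data"

    # An outsourced database deleted after the job, with no authority to alter it:
    print(classify(Record(True, False, False)))   # personal private data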

Obligations Defined for Each Type of Private Information

All three types of personal information are required to uphold the following obligations (Tahara & Yokohari, 2005):

•	Specify the purpose of usage of private information
•	Legal private information collection
•	Inform of privacy policy
•	Customer claim support

Personal private data and retained personal private data:

•	Maintain a safe and secure environment for storing private data
•	Observation of employees' business practices
•	Observation of outsourced companies
•	Limit data transfer

Retained personal private data:

•	Releasing, correction, addition, deletion, and stopping use of private data

Since the definition of information can be broad and vague, this classification can help companies understand the boundaries of private information. Each category has its own definitions of private information and obligations. In summary, personal private information is unorganized private information, such as paper-based information, which has not been sorted by index and cannot be retrieved easily. In other words, it is raw information before the company has a database to organize and manage it. Personal private data, by contrast, is organized private information; however, this type of information must be deleted within a 6-month period, and the private data cannot be published or modified. For example, if a small company organizes private information to create a customer database for another company as an outsourcing task, the private information will be deleted after the job is completed. In this case, the private information is considered personal private data. Lastly, retained personal private data is similar to personal private data in that it is organized private information stored in a database. However, in this case, the company has full authority and is allowed to release, correct, stop using, and delete data. It can also use the information for its business and keep the information for more than 6 months.

Limitations

Data Collection
The Act does not designate any categories of private data that companies are prohibited from collecting from consumers. In the U.S., the collection of sensitive data, such as political, religious, racial, or ethnic views, medical history, sexual orientation, and criminal records, is prohibited, and the Children's Online Privacy Protection Act protects young people under the age of 13. No such protections exist in Japan, and companies can collect any kind of data from anyone (Yamazaki, 2005).


Data Transfer

Data transfer is not prohibited by law in Japan. A company can transfer private data to other countries that do not have privacy laws comparable with Japan's, which means that violations of privacy could easily occur in those third countries. Even though a company located in Japan complies with Japanese privacy law, its customer data can easily be leaked by an outsourced company in a third country that has no privacy policies or laws. By contrast, the prevailing standard in privacy protection legislation elsewhere, especially in Europe, prohibits this type of activity in order to secure consumers' data (Yamazaki, 2005).

Private Data After Use

Private data that has been used in accordance with the company's privacy policy, which states the purposes for which the consumers' private information is used, can be kept in the company's database, even though the company cannot use it for any other purpose. The Act does not address private data after it has been used: once the task is completed, the company does not have to delete the private data that has already been used. Furthermore, if a company is acquired by another company, the private information is transferred to the acquiring company and can be used for marketing or other purposes (Yamazaki, 2005).

Obligation to Inform of Data Leaks

The Act contains no obligation requiring companies to inform consumers when a privacy violation occurs. Many states in the U.S. already have legislation protecting consumer privacy in this respect: the company is required to inform the customers involved in a privacy breach so that they can guard against further unknown activities involving their information (Yamazaki, 2005).

Right to Control Private Information


Consumers give up most of their right of control from the moment they send their private information to a company. Consumers can request updates to private information in the company's database if they find it incorrect, and in that case the company is required by law to correct or delete the wrong information. However, the company does not have to delete correct private information from its database even if consumers ask it to; under the Act it simply has no obligation to remove customers' private information (Yamazaki, 2005).

ONLINE PRIVACY PROTECTION MECHANISMS IN JAPAN

The three types of privacy protection mechanisms in Japan (opt-out, P3P, and the privacy seal program) were discussed earlier. Opt-out governs the way companies collect private data from consumers. P3P is a technology used to compare a company's privacy policy with a consumer's privacy preferences; it helps consumers understand the differences between the two, which is especially useful when privacy policies change frequently. Thirdly, the privacy seal program licenses companies to display a privacy mark on their Web sites. To join the privacy mark holders, a company must have its privacy protection system certified against the JIS Q 15001 standard. The privacy mark signals how reliable the company is online.

Opt-Out Policy for Third-Party Use in Japan

The Act forbids a company from passing private information to a third party without receiving the individual's permission; the company must have permission before giving private information for a third party's use. As a result, the company must use an opt-in or opt-out policy to deal with this situation (Tahara & Yokohari, 2005).
In Japan, most companies use the opt-out system, in which the company shows its privacy policy on its Web site. If consumers release their private data to the company, they are assumed to agree with the privacy policy of the company as well as with that of the third party. The Act requires companies to include four specific articles for the opt-out policy in their privacy policies: (1) that private information will be provided for third-party use, (2) the items of private information provided for third-party use, (3) the way the company gives private information to the third party, and (4) that when consumers request the company to disallow third-party use of their information, the company must accept the request. As long as a company posts a sufficient privacy policy on its Web site, it can use an opt-out system in Japan (Syukunan, 2007).
However, consumers might not realize that they have agreed to a company's opt-out privacy policy until they receive unexpected e-mail ads. In other words, consumers are given no explicit option to agree or disagree to a third party using their information.
In the U.S., the opt-in system, the opposite of the opt-out system, is used. Under both systems the company shows its privacy policy on its Web site and collects consumers' private data; the main difference is that in the opt-in system consumers have the right to make their choices before their private information is used (Bouckaert & Degryse, 2005).
The global trend is shifting toward opt-in systems, because offering consumers more choices has been shown to increase their satisfaction. Many Japanese companies are following this trend and adopting opt-in systems, but many others continue to use opt-out systems because of their convenience.
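The practical difference between the two regimes comes down to what happens when the consumer says nothing. The following minimal sketch is illustrative only; the class and field names are hypothetical and do not come from the Act or the cited sources.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ConsentRecord:
    regime: str                             # "opt-out" or "opt-in"
    consumer_choice: Optional[bool] = None  # None means the consumer never responded

    def third_party_use_allowed(self) -> bool:
        if self.consumer_choice is not None:
            return self.consumer_choice     # an explicit choice always wins
        # Silence permits third-party use only under an opt-out regime.
        return self.regime == "opt-out"

print(ConsentRecord("opt-out").third_party_use_allowed())  # True  (silence counts as consent)
print(ConsentRecord("opt-in").third_party_use_allowed())   # False (silence counts as refusal)
```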


P3P System

The New Media Development Association in Japan decided to adopt the Platform for Privacy Preferences (P3P) for privacy protection. However, adjustments for Japanese law and customs are required to use P3P in Japan, because P3P was developed mainly in Europe and the U.S. According to a survey by the New Media Development Association (2006), 11% of companies used P3P on their Web sites in 2005. Comparing the 2004 and 2005 surveys, the use of P3P decreased by 7% because of the following issues:

• P3P cannot fully express Japanese privacy policies.
• There is no manual for installing P3P.
• No browser is fully capable of using P3P.
• Translations are not native, so consumers can hardly understand them.
• Familiarity with P3P is low in Japan.

To promote P3P as a standard technology in Japan, it is important to resolve these issues. The history of privacy protection in Japan is short, and consumers' attention remains low. Consumer involvement is key to the popularity of protection mechanisms and will help them become used by more and more companies in Japan.
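For reference, a site deploying P3P typically announces its machine-readable policy in an HTTP response header that points to a policy reference file and may also carry a compact policy. The sketch below shows one way this could look in a minimal Python WSGI application; the compact-policy tokens and file location are illustrative and not a statement of any particular company's practice.

```python
def app(environ, start_response):
    # Advertise a P3P policy reference file plus a compact policy.
    # User agents that support P3P would fetch /w3c/p3p.xml for the full policy.
    headers = [
        ("Content-Type", "text/html; charset=utf-8"),
        ("P3P", 'policyref="/w3c/p3p.xml", CP="CAO PSA OUR"'),
    ]
    start_response("200 OK", headers)
    return [b"<html><body>Example page with a P3P header.</body></html>"]

if __name__ == "__main__":
    from wsgiref.simple_server import make_server
    make_server("localhost", 8000, app).serve_forever()
```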

Privacy Seals

In Japan, there is a privacy seal system called the privacy mark, which has a cross relationship with TRUSTe and BBBOnline in the U.S. The privacy mark program uses JIS Q 15001, a privacy protection management standard, to assess candidates. If candidates qualify, they can display the privacy mark on their Web sites and join the other privacy-marked companies (Okamura, 2004). The privacy mark system started in 1998, when there was no law to protect private information; in response, the Japan Private Information Development Association initiated a licensing system for privacy protection (Purpose of Privacy Mark System, 2007). A privacy mark license lasts for 2 years, after which the company must be certified again to keep using the mark. In 2007, more than 8,000 companies used the privacy mark.
Since the Act came into force in 2005, the privacy mark system has received far more attention than before. The privacy mark is one of the key elements in obtaining business from large companies. As mentioned previously, many outsourcing tasks, such as data entry and delivery, are available; however, the company taking them on must have a good privacy protection system installed, as required by the Act. As a result, companies in Japan use privacy marks as a gauge of privacy protection when they outsource tasks. If a company holds a privacy mark, it has a significant advantage when bidding for outsourcing jobs (Tahara & Yokohari, 2005).

FOUR-STAGE PRIVACY PROTECTION SOLUTION MODEL

Privacy protection has become a major issue in Japan since the Act took effect. Companies have been trying to integrate their systems and earn their customers' confidence. In this section, guidelines for establishing a privacy protection system are presented. Adapted from Tahara and Yokohari (2005) and Yamazaki (2005), a four-stage model is proposed as follows.

Stage 1: Check All Personal Private Information in the Company

There is a large amount of private information in a company, and the first thing the company needs to do is to identify and control all of it; if unknown private information exists in the company, it is impossible to protect it. According to the consumer claim service center in Japan, most violations involving private information occurred because of carelessness and a lack of attention to the Act. In order to get all employees involved in this stage, top executives should have all departments submit detailed lists of private information and create a master list of private information, including how it is kept. The master list should include the following items (a minimal record-structure sketch follows the list):

• Purpose of usage of the private information
• How the private information was collected
• Number of private information records
• Contents of the private information
• Whether the private information is edited and added to or not
• Storage location and how the records are stored
• Names of the users of the private information
• Whether the private information is shared or not
• Whether the private information is transferred or not
• When and how the private information will be deleted
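The sketch below renders one row of such a master list as a simple data structure; the field names are hypothetical and merely mirror the checklist above.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class PrivateInfoInventoryItem:
    """One entry in the master list of private information (illustrative)."""
    purpose_of_usage: str
    how_collected: str
    record_count: int
    contents: str                        # e.g., "name, address, purchase history"
    edited_or_added: bool
    storage_location: str                # e.g., "CRM database, head office"
    users: List[str] = field(default_factory=list)
    shared: bool = False
    transferred: bool = False
    deletion_plan: Optional[str] = None  # when and how the records will be deleted
```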

Stage 2: Analyze Whether There Are Any Violations of the Act

After the master list of private information is completed, the company should analyze its private information and compare it with the Act. This stage helps the company realize what problems it has so far and brings more attention to privacy protection. Most violations of privacy protection are internal cases rather than external ones such as a hacker's attack; as a result, educating employees is one of the most important elements of privacy protection. In order to implement a protection system, the company has to be aware of its current situation regarding private information. The following checklist can be used to analyze the company's current situation (a small sketch of how it might be evaluated follows the list):

• The use of the private information is specified
• The scope of usage of the private information is not exceeded
• The way the private information was collected is legal
• Consumers are informed of the purpose of the usage of the private information
• An incident plan is prepared in the case of a leak or loss of private information
• Education about private information is implemented
• Guidelines for outsourcing private information are prepared
• Agreement for outsourcing private information has been received from consumers
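As a trivial illustration of how such an audit could be recorded, the following sketch flags unmet checklist items; the item labels and function are hypothetical.

```python
STAGE2_CHECKLIST = [
    "use of private information is specified",
    "scope of usage is not exceeded",
    "collection method is legal",
    "consumers informed of purpose of usage",
    "incident plan for leak or loss exists",
    "employee education is implemented",
    "outsourcing guidelines are prepared",
    "consumer agreement for outsourcing obtained",
]

def unmet_items(answers):
    """Return the checklist items not yet satisfied (missing counts as unmet)."""
    return [item for item in STAGE2_CHECKLIST if not answers.get(item, False)]

audit = {"collection method is legal": True, "employee education is implemented": False}
print(unmet_items(audit))  # every item except "collection method is legal"
```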

Stage 3: Establish a Privacy Policy and Make It Public for All Consumers and Employees

Since the Act requires informing consumers of the purpose of the usage of private information when it is collected, a company must have a privacy policy to let consumers know. The privacy policy should include all legal obligations defined by the Act as well as the future plans of the company. The subjects that companies must include in the privacy policy are as follows.

Privacy Policy Subjects

• Purpose of usage of private information
• Policy on data transfer to outsourced companies
• Policy on sharing private information
• Policy on retained private data
• Contact information for claims or advice

Once the company has completed a detailed privacy policy, it must be published on the Web site. This not only satisfies the Act's requirement that the privacy policy be public and placed where consumers can easily access it, but also lets consumers learn the company's privacy policy before doing actual business with the company.

It is important that the company implement three plans (physical, internal, and external) to keep its privacy policy in effect.
In the physical plan, the company should secure all information through access control, security control, and removal control. First, it is vital to designate who can access what kind of information in the company and to keep the passwords and access lists up to date, so that the company can control access to private information and prevent private data leaks. For example, most Japanese companies give access authority for private information to all employees to make business run smoothly; however, this increases the risk of a data leak, so the company must restrict access authority to protect private information. The second element is security control. Physical theft is always a serious concern when protecting private information, so inventory management of equipment, especially portable devices such as laptop computers and USB flash drives, is also important, and the company should restrict or prevent private information from being copied to devices that could be taken outside of the company. The last element is the removal of private information. There are many cases in which improperly discarded private information was stolen and spread via the Internet, so the company should establish a policy for removing and storing private information.
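A compact sketch of the access-control and removal-control ideas above follows; the roles, data classes, and rules are hypothetical examples rather than requirements of the Act.

```python
from dataclasses import dataclass

# Hypothetical role-based access list: which classes of data each role may read.
ACCESS_MATRIX = {
    "customer_support": {"contact_info"},
    "billing":          {"contact_info", "payment_info"},
    "marketing":        set(),   # no direct access to private data
}

@dataclass
class Employee:
    name: str
    role: str

def may_access(employee: Employee, data_class: str) -> bool:
    """Allow access only if the employee's role is on the access list."""
    return data_class in ACCESS_MATRIX.get(employee.role, set())

def may_copy_to_portable_device(data_class: str) -> bool:
    """Illustrative removal control: private data never leaves on portable media."""
    return data_class == "public"

alice = Employee("Alice", "marketing")
print(may_access(alice, "contact_info"))            # False
print(may_copy_to_portable_device("contact_info"))  # False
```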
The internal plan requires companies to have an effective organizational system, to educate employees, and to establish employee working rules for dealing with private information. An effective organizational system designates authority and responsibility and provides effective communication between executive managers and employees, so that top executives can observe how private information is handled by each department and employee. Education for employees is meant to make all employees aware of the Act in their daily business practices. It is important to educate employees because, once a violation of privacy protection occurs, the company needs to show how much its employees have been educated about the Act. There are four steps to educate employees:
• Make all employees take classes about the Act
• Answer and solve all privacy-related inquiries in daily business activities
• Test all employees' knowledge of the Act
• Establish a working manual and distribute it to all employees

The establishment of working rules helps increase employees' awareness of the risks associated with violating the Act. The rules should include rewards and punishments for violations in order to motivate all employees and involve them in creating a secure environment for private information.
The external plan concerns contracting with outsourced companies. The Act requires the company to monitor its outsourced companies; if a contracted company violates the Act, the responsibility falls on the outsourcing company. When a company outsources its data entry or delivery tasks to another company, it is required to have a specific and strict policy for selecting a reliable contractor. It is common business practice for a large firm to outsource its non-core or minor tasks to smaller companies; however, violations of the Act have frequently occurred in exactly these situations.

Stage 4: Establish Guidelines in Case a Violation of the Act Occurs

Companies need to make good decisions and take action to minimize the damage associated with a violation of the Act. In order to make correct decisions, the company should establish appropriate guidelines for responding to incidents that violate privacy laws. The first response perceived by the company's stakeholders is typically the most critical, because it shapes how they judge the company's reaction. The company must make sure it knows exactly what happened and how much the damage could cost before working to minimize the secondary damage to consumers.

CASE STUDIES

Case I: Yahoo! BB's Data Leak

Yahoo! BB, a DSL service owned by Softbank Co. in Japan, had the private data of 4.5 million people leaked in 2004 when its systems were illegally accessed from outside the company. Yahoo! BB is one of the largest DSL Internet providers in Japan. Its service began in 2001, and because the service was less expensive and its bandwidth broader than its competitors', it gained more than one million customers in 3 years (Sasahara, 2005a). However, the massive privacy violation occurred because Yahoo! BB could not update its systems quickly enough as the company grew, and it did not treat private information as a corporate asset with a higher priority than the company's growth.
Yahoo! BB's private information was stored in an internal database. This database was connected to the company's network, so employees could access it from outside the company if they had both a remote access account and a database access account. According to Yahoo! BB's investigation (Suzuki, 2004), 100 people had a remote or database access account, and 18 had both. Yahoo! BB had two types of account for each kind of access: group and personal. The data leak started with a person who had both types of account. This person retired and his Yahoo! BB personal account was terminated; however, the group accounts for remote and database access were not updated. As a result, he could still easily access the database, which stored more than 4.5 million customers' private data, from outside the company with the group account. The leak itself was caused not by this person but by another person who obtained the account information from him. The group account in question had not been changed since 2002 and was finally deleted in 2005 (Suzuki, 2004). The group account did not necessarily need to exist, but it was convenient for conducting daily business activities. The failure to manage access control was the critical catalyst that triggered what became one of the largest private data leaks in the world.
Yahoo! BB then decided to physically separate the customer database containing private data from any network and put in place 684 security checks, such as fingerprint authentication and strict access control, to prevent further private data leaks. After all investigations were completed, Yahoo! BB mailed gift cards worth $5 to every customer whose private information had been leaked (Suzuki, 2004).

Case II: Michinoku Bank's Customer Data Loss

Michinoku Bank, located in Aomori, Japan, lost the private data and all banking information of 1.28 million customers in 2005. For security purposes, the bank had created backup CDs of all customers' private information, and these CDs were lost for unknown reasons. According to Michinoku's press release, the CDs may have been accidentally thrown away with other trash, and it is possible that the customers' information will not fall into anyone's hands (Imaio, 2005). However, this conjecture could not be proved, because no one could know the location of the CDs. After the news was released, Michinoku Bank received a correction warning and a business improvement order from the financial regulator of the Japanese government. The bank's reputation collapsed, its stock price dropped more than 30%, and it lost $300 million (Imaio, 2005).
This case was only the tip of the iceberg. Since the private information protection law came into force in 2005, similar problems have been reported to the government: the private data of 6.78 million people was lost by 287 companies (Sasahara, 2005b). According to the survey, 75% of these companies decided to provide security education on handling private data for all employees, 67% reengineered their business processes, and 65% established customer support centers.
Increasing employees' security awareness and education about the Act has thus become the most common practice. Even though online security in the financial business has become more reliable, companies could not prevent human-related flaws because of poor security education for their employees.

Analysis and Implication


Case I: The Yahoo! BB case shows how important a company's access control and organizational reporting systems are. As addressed in the previous section, a company must establish its own privacy policy. In order to implement its policy, Yahoo! BB needed to put physical and internal controls in place for all employees, including establishing access control, restructuring its organizational system, offering security training and education, and establishing employee working rules. These controls were clearly absent in Yahoo! BB's case. The following are the root causes of this case.

• Access control: Yahoo! BB had little access control over its private data. There was no list of employee access authorities, and the access logs were too small to record all unauthorized accesses.
• Restructuring the organizational system: Yahoo! BB did not have an efficient organizational system for controlling each department and contracted company, and there was no effective communication between top executives and employees. As a result, top executives did not even know that the group accounts existed until they were identified by the major investigation, and those accounts had not been updated for more than 3 years.
• Security training and education: The employees had no training or education on the Act and were not aware of the risks of violating privacy protection.
• Employee working rules: The working environment was almost unrestricted. There was no rule for dealing with customers' files, and employees could copy any files to a personal device if they could access the database.

Yahoo! BB's main focus was getting more customers and more business to meet its fast-growth goals, but along the way it sacrificed its customers' privacy protections. As a result, the company's reputation was significantly damaged after the incident. The company should have created a privacy policy not only for customers but also for all employees; core information should not be easily obtainable by anyone. Security training and education should have been provided, and employees should have fully understood the Act and their working rules. In this case, there was no need to provide so many group accounts for employees, especially since they had not been changed for three years; such accounts should be restricted and updated periodically (Kai, 2004).
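To make the last point concrete, a periodic account review might flag shared or group accounts whose credentials have not been rotated within a defined window. The sketch below is illustrative only; the account names, the 90-day limit, and the data layout are hypothetical.

```python
from datetime import date

MAX_CREDENTIAL_AGE_DAYS = 90  # hypothetical rotation policy

def stale_accounts(accounts, today):
    """Return account names whose credentials are overdue for rotation."""
    return [
        name for name, last_rotated in accounts.items()
        if (today - last_rotated).days > MAX_CREDENTIAL_AGE_DAYS
    ]

accounts = {
    "remote-group":   date(2002, 5, 1),   # shared account, never rotated
    "db-group":       date(2002, 5, 1),
    "alice.personal": date(2004, 3, 15),
}
print(stale_accounts(accounts, today=date(2004, 5, 31)))
# ['remote-group', 'db-group']
```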
Case II: The Michinoku Bank case points out how important employee education and physical security controls are. Because the Act had only come into effect in 2005, employees were not yet familiar with the law and did not perceive the impact of privacy breaches. To reduce this security concern in daily business activities, the company should provide quality security training and education for all employees so that they can work more safely under the law. It is also important to implement security control: as addressed in the previous section, physical security control is part of privacy protection. Michinoku Bank had strong protection against attacks from outside; however, the privacy violation came from inside the company, which is why it is critical to create an employee privacy policy.
Michinoku Bank did not have organized physical security control, and its most valuable information was lost through human error. The root causes are summarized below.

• Lack of security control: The bank did not have any inventory control over its private data. It needs to create a department to control and restrict corporate physical inventory. The private data of more than a million people can easily be saved on a couple of CDs, so the company must pay far more attention to how private data is handled.
• Lack of security training and education: The lack of employee security training and education resulted in too little attention to valuable corporate assets such as CDs holding private data. Even when a company has a good security control policy, security training and education must still be provided for all employees.

In summary, both major privacy breaches occurred internally rather than externally. Both cases exhibit major problems at the stage of establishing a privacy policy, resulting from a lack of access control, physical security control, and security training and education. The companies were not fully compliant with the Act, which was the first privacy law for commercial companies, yet their weak privacy protection degraded their business reputations. Under the Act, the companies were required to report their privacy incidents to the government. Both companies need to review their privacy protection systems and reorganize them to prevent similar incidents from happening again.

FUTURE TRENDS

The year 2005 was a major turning point for all businesses in Japan: the first law on the protection of personal information held by commercial companies was launched, and companies have been trying to adapt to it. The Act on the Protection of Personal Information has brought a new perspective on privacy to all Japanese companies and consumers. Privacy protection in Japan remains at an early stage, and a learning curve toward a mature privacy system is expected.

CONCLUSION

For Japanese companies and consumers, it is important to understand the details of the new privacy law and to move from the traditional privacy protection system to the new one. The Act also helps reveal many privacy breaches and violations, and many privacy-related lawsuits are expected. However, companies cannot escape the required changes, since online privacy has become a priority concern for consumers in Japan.
Consumer online privacy is receiving more and more attention because consumers are being educated about the importance of protecting private information and the impact of privacy breaches. For companies, protecting consumers' privacy has become a primary issue, since it helps them gain their customers' trust and business. Companies should take responsibility not only for enhancing their online privacy protection systems but also for educating their employees and customers and moving the concept of privacy in Japanese culture toward a new stage. In e-commerce, online privacy can be protected by means of mechanisms such as opt-in, the privacy mark, and P3P. A privacy seal program such as the privacy mark can help a company provide more information in its privacy policy for consumers. P3P allows consumers to choose their privacy level themselves rather than letting the company determine the level of privacy they will have. An opt-in policy allows consumers to select how their private information is used online.
Weak privacy protection could cost companies even more than what the Yahoo! BB and Michinoku Bank cases exhibited. When implementing a privacy protection system, companies need to ensure compliance with regulatory codes and legal obligations.


FUTURE RESEARCH DIRECTIONS

The Japanese privacy protection system has not yet matured since the Act on the Protection of Personal Information was launched in 2005, and many adjustments and changes are expected. It is suggested that future research pursue the following directions:

• How does a company in Japan make adjustments to comply with the new Act on the Protection of Personal Information?
• How is consumers' attention shifting toward the new Act, which somewhat contradicts Japanese culture? Japanese people respect privacy in their culture, but the new law will stimulate their thinking about, and need for, privacy.
• The longitudinal evolution of the Act should be observed and analyzed. There is much room for the Act to be corrected or adjusted as time goes on, and research can help establish new models for the future privacy protection system.

REFERENCES

Act on the Protection of Personal Information. Cabinet Office, Government of Japan (2005). http://www5.cao.go.jp/seikatsu/kojin/foreign/act.pdf

Analysis of Japanese privacy protection technology (2005). Retrieved from New Media Development Association: http://www.nmda.or.jp/enc/privacy/pr_rep_2006.pdf

Analysis of new technology for privacy protection system (2006, March). Retrieved from New Media Development Association: http://www.nmda.or.jp/enc/privacy/pr_rep_2006.pdf

Arai, T. (2005). Outline of personal information protection act. Information Management, 2-15.

Benassi, P. (1999). TRUSTe: An online privacy seal program. Communications of the ACM, 42(2), 56-59.

Bouckaert, J., & Degryse, H. (2005). Opt in versus opt out: A free-entry analysis of privacy policies. http://weis2006.econinfosec.org/docs/34.pdf

Cranor, L. F. (2001). Introduction to P3P. In L. F. Cranor (Ed.), Web privacy with P3P. O'Reilly.

Goto, T. (2006, March). B to B, C to C, C to B, C to C. Retrieved from Next Generation Electronic Commerce Promotion Council of Japan: http://www.ecom.jp/journal/2007/ECOM_Journal_2007.pdf

Goto, T. (2007, March). B to B, C to C, C to B, C to C. Retrieved from Next Generation Electronic Commerce Promotion Council of Japan: http://www.ecom.jp/journal/2006/ECOMJournal2006.pdf

Imaio, K. (2005). Michinoku bank privacy data lost. Retrieved from IT pro: http://itpro.nikkeibp.co.jp/free/NIP/NIPCOLUMN/20050616/162853/

Investigation result of data leak from Lawson. (2003). Retrieved from CNET Japan: http://japan.cnet.com/news/media/story/0,2000056023,20060378,00.htm

Kai, H. (2004). Comment from President of Softbank Co. Retrieved from Internet Watch: http://internet.watch.impress.co.jp/cda/news/2004/02/27/2257.html

Kimery, K., & McCord, M. (2006). Signals of trustworthiness in e-commerce: Consumer understanding of third-party assurance seals. Journal of Electronic Commerce in Organizations, 4(4), 52-74.

Mizutani, M., Dorsey, J., & Moor, J. (2004). The Internet and Japanese conception of privacy. Ethics and Information Technology, 6(2), 121-128.

Nakada, M., & Tamura, T. (2005). Japanese conceptions of privacy: An intercultural perspective. Ethics and Information Technology, 7(1), 27-36.

Okamura, H. (2004). Privacy protection and rights to use privacy information in secure environment. Japan: Foundation Internet Association.

Purpose of Privacy Mark System. (2007, April 9). Retrieved from Privacy Mark: http://privacymark.jp/privacy_mark/about/outline_and_purpose.html

Sasahara, E. (2005a, June 29). Security hole of privacy information database. Retrieved from IT pro: http://itpro.nikkeibp.co.jp/free/smbit/smbit/20050629/163612/?ST=smb

Sasahara, E. (2005b, September 29). Privacy data lost in financial business. Retrieved from IT pro: http://itpro.nikkeibp.co.jp/article/SMBIT/20050928/221820/?ST=management

Suzuki, T. (2004, May 31). Overview of Yahoo! BB data leak. Retrieved from IT pro: http://itpro.nikkeibp.co.jp/free/NC/NEWS/20040531/145132/

Syukunan, T. (2007). Analysis for efficiency of anti-spam ware. Media Communications (Japanese), 57, 2-10.

Tahara, M., & Yokohari, K. (2005). Warning of usage of privacy information. Tokyo: Asa.

Yamazaki, F. (2005). Privacy protection law. Nikkei BP.

ADDITIONAL READINGS

Ackerman, M. (2004). Privacy in pervasive environments: Next generation labeling protocols. Personal and Ubiquitous Computing, 8(6), 430-439.

Agrawal, R., Kiernan, J., Srikant, R., & Xu, Y. (2003). An XPath-based preference language for P3P. In Proceedings of the Twelfth International World Wide Web Conference (WWW2003) (pp. 629-639).

Andrade, E., Kaltcheva, V., & Weitz, B. (2002). Self-disclosure on the web: The impact of privacy policy, reward, and company reputation. Advances in Consumer Research, 29, 350-353.

Antón, A., & Earp, J. (2004). A requirements taxonomy to reduce website privacy vulnerabilities. Requirements Engineering Journal, 9(3), 169-185.

Antón, A., He, Q., & Baumer, D. (2004). The complexity underlying JetBlue's privacy policy violations. IEEE Security & Privacy, 2(6), 12-18.

Ashrafi, N., & Kuilboer, J. (2005). Online privacy policies: An empirical perspective on self-regulatory practices. Journal of Electronic Commerce in Organizations, 3(4), 61-74.

Belanger, F., Hiller, J., & Smith, W. (2002). Trustworthiness in electronic commerce: The role of privacy, security, and site attributes. Journal of Strategic Information Systems, 11(3/4), 245-270.

Brown, M., & Muchira, R. (2004). Investigating the relationship between Internet privacy concerns and online purchase behavior. Journal of Electronic Commerce Research, 5(1), 62-70.

Capurro, R. (2005). Privacy: An intercultural perspective. Ethics and Information Technology, 7(1), 37-47.

Cassidy, C., & Chae, M. (2006). Consumer information use and misuse in electronic business: An alternative to privacy regulation. Information Systems Management, 23(3), 75-87.

Davison, R., Clarke, R., Smith, J., Langford, D., & Kuo, B. (2003). Information privacy in a globally networked society: Implications for IS research. Communications of the Association for Information Systems, 12, 341-365.

Deutsch, D., Ekert, A., Jozsa, R., Macchiavello, C., Popescu, S., & Sanpera, A. (1996). Quantum privacy amplification and the security of quantum cryptography over noisy channels. Physical Review Letters, 77(13), 2818-2821.

Eldin, A. A., & Wagenaar, R. (2004). Towards users driven privacy control. IEEE International Conference on Systems, Man and Cybernetics, 5, 4673-4679.

Faja, S. (2005). Privacy in e-commerce: Understanding user trade-offs. Issues in Information Systems, 6(2), 83-89.

Fitzpatrick, A., Evansburg, A., Fiore, M., Watson, V., & Welch, K. (2003). Japanese Parliament passes legislation governing use of personal information. Intellectual Property & Technology Law Journal, 15(8), 18.

Floridi, L. (2002). On the intrinsic value of information objects and the infosphere. Ethics & Information Technology, 4(4), 287-304.

Gritzalis, D. (2004). Embedding privacy in IT applications development. Information Management and Computer Security, 12(1), 8-16.

Kawai, M. (2004). Anzen sinwa houkai no paradokkus (Paradox of the collapse of belief in safety in Japan). Tokyo: Iwanami.

Kudo, M., Araki, Y., Nomiyama, H., Saito, S., & Sohda, Y. (2007). Best practices and tools for personal information compliance management. IBM Systems Journal, 46(2), 235-254.

Milne, G., Rohm, A., & Bahl, S. (2004). Consumers' protection of online privacy and identity. Journal of Consumer Affairs, 38(2), 217-232.

Mizutani, M., Dorsey, J., & Moor, J. (2004). The Internet and Japanese conception of privacy. Ethics and Information Technology, 6(2), 121-128.

Nakada, M., & Tamura, T. (2005). Japanese conceptions of privacy: An intercultural perspective. Ethics and Information Technology, 7(1), 27-36.

Pavlou, P., & Fygenson, M. (2006). Understanding and predicting electronic commerce adoption: An extension of the theory of planned behavior. MIS Quarterly, 30(1), 115-143.

Peslak, A. (2006). Internet privacy policies of the largest international companies. Journal of Electronic Commerce in Organizations, 4(3), 46-62.

Reichheld, F., & Schefter, P. (2000). E-loyalty: Your secret weapon on the web. Harvard Business Review, 78(4), 105-113.

Rifkin, J. (2000). The age of access. New York: Putnam.

Skinner, G., & Chang, E. (2006a). A conceptual framework for information privacy and security in collaborative environments. International Journal of Computer Science and Network Security, 6(2B), 166-172.

Skinner, G., & Chang, E. (2006b). A projection of the future effects of quantum computation on information privacy and information security. International Journal of Computer Science and Network Security, 6(8B), 189-194.

Yasu, K., Akahane, Y., Ozaki, M., Semoto, K., & Sasaki, R. (2005). Evaluation of check system for improper sending of personal information in encrypted mail system. IPSJ (Information Processing Society of Japan) Journal, 46(8), 1976-1983.

Zhang, X., Sakaguchi, T., & Kennedy, M. (2007). A cross-cultural analysis of privacy notices of the global 2000. Journal of Information Privacy & Security, 3(2), 18-37.


Compilation of References

42 C.F.R. § 482.51
42 U.S.C. § 290dd-2 (2006).
Abercrombie, N., & Longhurst, B. (1998). Audiences:
A sociological theory of performance and imagination.
CA: Sage Publication.
Aberdeen Group. (2005). Third brigadebusiness
value research seriesmost important security action: Limiting access to corporate and customer data.
Whitepaper. Retrieved October 2007, from http://www.
thirdbrigade.com/uploadedFiles/Company/Resources/
Aberdeen%20White%20Paper%20--%20Limiting%20
Access%20to%20Data.pdf
Aberer, K., & Despotovic, Z. (2001). Managing trust in
a peer-2-peer information system. In Proceedings of the
2001 ACM CIKM International Conference on Information and Knowledge Management, Atlanta, Georgia (pp.
310-317). New York: ACM.
Act on the protection of personal information. Cabinet
Office, Government of Japan (2005). http://www5.cao.
go.jp/seikatsu/kojin/foreign/act.pdf
Agrawal, D., & Aggarwal, C. (2001). On the design
and quantification of privacy preserving data mining
algorithms. In Proceedings of the Twentieth ACM SIGMOD-SIGACT-SIGART Symposium on Principles of
Database Systems, PODS01, Santa Barbara, California
(pp. 247-255). New York: ACM.
Aiken, K., & Boush, D. (2006). Trustmarks, objectivesource ratings, and implied investments in advertising:
Investigating online trust and the context-specific nature
of internet signals. Academy of Marketing Science Journal, 34(3), 308-323.

Air Defense Press Release. (2005, February 17). AirDefense monitors wireless airwaves at RSA 2005 conference. Retrieved October 2007, from http://airdefense.
net/newsandpress/02_07_05.shtm
Akenji, L. (2004). The eight basic consumer rights.
Retrieved November 8, 2006, from http://www.tudatosvasarlo.hu/english/article/print/254
Allen, A. (2000). Gender and privacy in cyberspace.
Stanford Law Review, 52(5), 1175-1200.
Allens Arthur Robinson. (2007). International data
flow. Retrieved October 2, 2007, from www.aar.com.
au/privacy/over/data.htm
Ambrose, S., & Gelb, J. (2006). Consumer privacy litigation and enforcement actions in the United States. The
Business Lawyer, 61, 2.
American Board of Medical Specialties. (2007). Retrieved
June 1, 2007 from http://www.abms.org
American Civil Liberties Union (ACLU). (2007). Privacy
section of their web site. Retrieved July 12, 2007, from
http://www.aclu.org/privacy
American Heritage. (2006). American Heritage Dictionary of the English Language (4th ed.). Retrieved April
16, 2007, from http://dictionary.reference.com/browse/
copyright
American Management Association. (2005). Electronic
monitoring & surveillance survey. Retrieved October
2007, from http://www.amanet.org/research/pdfs/
EMS_summary05.pdf



American Medical Association. (2007). Informed consent.


Retrieved June 1, 2007, from http://www.ama-assn.
org/ama/pub/category/4608.html
Analysis of Japanese privacy protection technology (2005). Retrieved from New Media Development Association: http://www.nmda.or.jp/enc/privacy/pr_rep_2006.pdf
Analysis of new technology for privacy protection system (2006, March). Retrieved from New Media Development Association: http://www.nmda.or.jp/enc/privacy/pr_rep_2006.pdf
Anassutzi, M. (2002). E-commerce directive 00/31.
International Company and Commercial Law Review,
13(9), 337-342.
Anderson, R. (1996). The eternity service. In Proceedings of the 1st International Conference on the Theory,
Applications of Cryptology, PRAGOCRYPT96.
APEC Electronic Commerce Steering Group. (2004).
APEC voluntary online consumer protection guidelines.
Retrieved March 1, 2005, from http://www.export.gov/
apececommerce/cp/guidelines.htm
Apfelroth, J. (2006). The open government act: A proposed bill to ensure the efficient implementation of the
freedom of information act. Administrative Law Review,
58(1), 219.
Apple. (2007). iTunes [software]. Retrieved December
9, 2007, from http://www.apple.com/itunes/
Arai, T. (2005). Outline of personal information protection act. Information Management, 2-15.
Arbaugh, J. B. (2000). An exploratory study of effects
of gender on student learning and class participation in
an internet-based MBA course. Management Learning,
31(4), 503-519.
Armstrong, H. L., & Forde, P. G. (2003). Internet anonymity practices in computer crime. Information Management
& Computer Security, 11(5), 209-215.
Ashrafi, N., & Kuilboer, J. (2005). Online privacy policies:
An empirical perspective on self-regulatory practices.

Journal of Electronic Commerce in Organizations,


3(4), 61-74.
Ashworth, L., & Free, C. (2006). Marketing dataveillance and digital privacy: Using theories of justice to understand consumers' online privacy concerns. Journal of Business Ethics, 67, 107-123.
Associated Press. (2005). Business schools: Harvard
to bar 119 applicants who hacked admissions site. The
Globe and Mail, March 9, B12.
Auerbach, J. G. (1999, March 31). To get IBM ad, sites
must post privacy policies. The Wall Street Journal,
pp. B1, B4.
Australian Communications and Media Authority.
(2005). Consumer information. Anti spamfighting
spam in Australia: Consumer information. Retrieved
July 22, 2005, from http://www.acma.gov.au/ACMAINTER.2163012:STANDARD:848603301:pc=PC_
1965#anti%20spam%20law
Australian Competition and Consumer Commission.
(1999). ACCC conditionally authorises ADMA code
of practice. Retrieved March 27, 2007, from http://
www.accc.gov.au/content/index.phtml/itemId/322914/
fromItemId/621589
Australian Competition and Consumer Commission.
(2003). Review of building consumer sovereignty in
electronic commerce (best practice model). Retrieved
November 11, 2006, from http://www.ecommerce.
treasury.gov.au/bpmreview/content/_download/submissions/accc.rtf
Australian Competition and Consumer Commission.
(2004). Annual report 2003-2004fostering competitive, efficient, fair and informed Australian markets.
Canberra, ACT: Australian Competition and Consumer
Commission.
Australian Consumers Association. (2004). Submission
to the review of the private sector provisions of the privacy



act 1988 (Cth) (the privacy act) Sydney, NSW: Office of


the Privacy Commissioner (Australia).

November 11, 2006, from http://www.eff.org/news/archives/2006_02.php

Australian Direct Marketing Association (ADMA).


(2005). ADMA profile. Retrieved August 17, 2005, from
http://www.adma.com.au/asp/index.asp?pgid=2026

Baratz, A., & McLaughlin, C. (2004). Malware: what it


is and how to prevent it. Retrieved November 11, from
http://arstechnica.com/articles/paedia/malware.ars

Australian Federal Police (AFP). (2007). Internet fraud.


Retrieved March 16, 2007, from http://www.afp.gov.
au/national/e-crime/internet_scams

Barbaro, M. (2006, March 7). Wal-Mart enlist bloggers


in P.R. campaign. The New York Times.

Australian Institute of Criminology. (2006). More malware: Adware, spyware, spam and spim. High Tech Crime Brief, 1(2006), 1-2.
Australian Privacy Foundation. (2005). Rule 3: Objectives and purposes. Retrieved April 2, 2007, from
http://www.privacy.org.au/About/Objectives.html
Australian Privacy Foundation. (2006). Identity checks
for pre-paid mobile phones. Retrieved April 2, 2007,
from http://www.acma.gov.au/webwr/_assets/main/
lib100696/apf.pdf
Australian Privacy Foundation. (2006). International
instruments relating to privacy law. Retrieved February
23, 2007, from http://www.privacy.org.au/Resources/
PLawsIntl.html
Awad, N. F., & Krishnan, M. S. (2006). The personalization privacy paradox: An empirical evaluation of information transparency and the willingness to be profiled online
for personalization. MIS Quarterly, 30(1), 13-28.

Barnes, G. R., Cerrito, P. B., & Levi, I. (1998). A mathematical model for interpersonal relationships in social
networks. Social Networks, 20(2), 179-196.
Barreto, M., & Ellemers, N. (2002). The impact of anonymity and group identification on progroup behavior
in computer-mediated groups. Small Group Research,
33, 590-610.
Basili, J., Sahir, A., Baroudi, C., & Bartolini, A. (2007,
January). The real cost of enterprise wireless mobility
(Abridged ed.). The Aberdeen Group. Retrieved October
2007, from http://www.aberdeen.com/summary/report/
benchmark/Mobility_Management_JB_3822.asp
Baumer, D. L., Earp, J. B., & Poindexter, J. C. (2004).
Internet privacy law: a comparison between the United
States and the European Union. Journal of Computers
& Security, 23(5), 400-412.
Baylor, K. (2006, October 26). Killing botnets McAfee.
Retrieved March 2007, from http://blogs.techrepublic.
com.com/networking/?cat=2

Bagozzi, R. (1975). Marketing as exchange. Journal of


Marketing, 39(4), 32-39.

BBBOnline. (2003). Dispute resolution. Retrieved July 19,


2006, from http://www.bbbonline.org/privacy/dr.asp

Bandler, R. (2000). Persuasion engineering. Capitola,


CA: Meta Publications.

BBBOnline. (2005). European Union/US safe harbor


compliance. Retrieved March 25, 2005, from http://www.
bbbonline.org/privacy/eu.asp

Bang v. Miller, 88 N.W. 186 (Minn. 1958). Retrieved


June 1, 2007, from LexisNexis, Cases, Law School
database.
Bangeman, E. (2006). Dell growth rate slips behind
market. Retrieved July 20, 2006, from http://arstechnica.
com/news.ars/post/20060420-6640.html
Bankston, K. (2006, February 9). Press releases: February, 2006 | electronic frontier foundation. Retrieved


BBBOnLine. (2007). Retrieved December 9, 2007, from


http://www.bbbonline.org/
Beck, J., & Herrmann, M. (2007). Drug and device
law: HIPAA does not preempt state litigation practice.
Retrieved July 2, 2007, from http://druganddevicelaw.
blogspot.com/2007/02/hipaa-does-not-preempt-statelitigation.html


Belanger, F., Hiller, J. S., & Smith, W. J. (2002). Trustworthiness in electronic commerce: The role of privacy,
security, and site attributes. Journal of Strategic Information Systems, 11(3-4), 245-270.
Belch, G. E., & Belch, M. A. (2004). Advertising and
promotion: An integrated marketing communications
perspective (6th ed.). New York: McGraw-Hill/Irwin.
Bellman, S., Johnson, J. J., Kobrin, S. J., & Lohse, L. L.
(2004). International differences in information privacy
concerns: A global survey of consumers. The Information
Society, 20, 313-324.
Beltramini, R. F. (2003). Application of the unfairness
doctrine to marketing communications on the Internet.
Journal of Business Ethics, 42(4), 393-400.
Benassi, P. (1999). TRUSTe: An online privacy seal program. Communications of the ACM, 42(2), 56-59.
Bennett, C. (2006). Keeping up with the kids. Young
Consumers, Quarter 2, 28-32.
Bennett, C. J. (1992). Regulating privacy. Ithaca, NY:
Cornell University Press.
Berendt, B., Günther, O., & Spiekermann, S. (2005).
Privacy in e-commerce: Stated preferences vs. actual
behavior. Communications of the ACM, 48(4), 101-106.
Bernard, A. (2006). McAfees top ten security threats
for 2007. Retrieved October, from http://www.cioupdate.
com/print.php/3646826
Berner, R. (2006, May 29). I sold it through the grapevine.
Business Week, pp. 32-33.
Bhargava, B. (2006, September). Innovative ideas in
privacy research (Keynote talk). In Proceedings of the
Seventeenth International Workshop on Database and
Expert Systems Applications DEXA06, Krakw, Poland (pp. 677-681). Los Alamitos, CA: IEEE Computer
Society.
Bhargava, B., & Zhong, Y. (2002). Authorization based
on evidence and trust. In Y. Kambayashi, W. Winiwarter,
& M. Arikawa (Eds.), Proceedings of 4th International
Conference on Data Warehousing and Knowledge

Discovery (DaWaK 2002) Lecture Notes in Computer


Science (Vol. 2454, pp. 94-103). Heidelberg, Germany:
Springer.
Bhargava, B., Farkas, C., Lilien, L., & Makedon, F. (2003,
September 14-16). Trust, privacy, and security: summary
of a workshop breakout session at the national science
foundation information and data management (IDM)
workshop held in Seattle, Washington (Tech. Rep. No.
2003-34). West Lafayette, IN: Purdue University, Center
for Education and Research in Information Assurance
and Security (CERIAS).
Bhargava, B., Lilien, L., Rosenthal, A., & Winslett, M.
(2004). Pervasive trust. IEEE Intelligent Systems, 19(5),
74-77.
BJ's Wholesale Club settles FTC charges. (2005). Retrieved from http://www.ftc.gov/opa/2005/06/bjswholesale.htm
Blaze, M., Feigenbaum, J., & Lacy, J. (1996). Decentralized trust management. In Proceedings 1996 IEEE
Symposium on Security and Privacy (pp. 164-173).
Bloom, I. (2006). Freedom of information law in the digital
age: The death knell of informational privacy. Richmond
Journal of Law & Technology, 12(Spring), 9.
BMG Canada Inc. v. John Doe, Volume Number 858,
(2005).
Bogozza, D. (2006). WHO and partners accelerate fight
against counterfeit medicines. PR Newswire US. Retrieved June 5, 2007, from LexisNexis News
Bommareddy v. Superior Court, 222 272 Cal. Rptr. 246
(Cal. Ct. App.1990). Retrieved from LexisNexis Cases,
Law School database.
Borland, J. (2006). Apple's iTunes raises privacy concerns.
CNet News. Retrieved July 7, 2007, from http://news.com.
com/Apples+iTunes+raises+privacy+concerns/21001029_3-6026542.html
Bouckaert, J., & Degryse, H. (2005). Opt in versus opt out:
A free-entry analysis of privacy policies. http://weis2006.
econinfosec.org/docs/34.pdf



Boyer, M. C. (2004). Texas administrative agencies tackle compliance with the health insurance portability and accountability act's privacy rule. Texas Tech Journal of Administrative Law, 5, 100-111.
Brands, S. (2000). Rethinking public key infrastructures
and digital certificates. Cambridge, MA: The MIT
Press.
Brandt, D. (n.d.). Google as big brother. Retrieved November 11, 2006, from http://www.google-watch.org
Brazell, L. (2004). Electronic signatures law and regulation (1st ed.). London: Sweet & Maxwell.
Brenner, S. (2001). Is there such a thing as virtual
crime? Retrieved February 1, 2006,from http://www.
crime-research.org/library/Susan.htm
Brenner, S. W. (2005). The search and seizure of computers and electronic evidence: The fourth amendment
in an era of ubiquitous technology. Mississippi Law
Journal, 75(Fall), 1.
Brenner, S. W., & Clark, L. L. (2006). Fourth amendment
protection for shared privacy rights in stored transactional
data. Journal of Law and Policy, 14, 211.
Brooke, J., & Robbins, C. (2007). Programmer gives up
all the money he made distributing spyware. Retrieved
from http://www.ftc.gov/opa/2007/02/enternet.htm
Brown, J. (1996). Australia and the modern consumer
movement. A History of the Australian Consumer Movement (pp. 1-6). Braddon, ACT: Consumers Federation
of Australia.
Brunner, C. (1991). Gender and distance learning. Annals of the American Academy of Political and Social
Science, 133-145.

European consumer legislation information society.


Journal of Consumer Policy, 24(3/4), 287-338.
C. Martin (Ed.), Webster's new world hacker dictionary
(pp. 373-380). Indianapolis, IN: Wiley Publishing, Inc.
Cafarchio, P. (2004). The challenge of non-viral malware!
TISC Insight Newsletter, 4(12). Retrieved October 2007,
from www.pestpatrol.com/Whitepapers/NonViralMalware0902.asp
Cai, X., & Gantz, W. (2000). Online privacy issues associated with web sites for children. Journal of Broadcasting
& Electronic Media, 44(2), 197-214.
Cal. Civ. Code 56-56.37 (LexisNexis 2007).
Campbell, J., & Carlson, M. (2002). Panopticon.com:
Online surveillance and the commodification of privacy.
Journal of Broadcasting & Electronic Media, 46(4),
586-606.
Can Google diagnose illness better than doctors? (Nov.
6, 2006). The Daily Mail (London). Retrieved June
4, 2007, from http://www.seroundtable.com/archives
/006667.html
Can, A. B. (2007). Trust and anonymity in peer-to-peer
systems. Ph.D. thesis, West Lafayette, Indiana: Purdue
University.
Cannon, D. M., & Kessler, L. (2007). Danger: Corporate data breach! Journal of Corporate Accounting &
Finance, 18(5), 41-49.
CAN-SPAM Act of 2003, S.877, 108th Congress,
(2003).
Capitol v. Does 1-16, 07-485, WJ/LFG.
Capurro, R. (2005). Privacy. An intercultural perspective.
Ethics and Information Technology, 7, 37-47.

Bumgarner, J., & Borg, S. (2007). The US-CCU cyber


security check list. Retrieved November 2007, from
http://www.usccu.us/documents/US-CCU%20CyberSecurity%20Check%20List%202007.pdf

Carbo, J., Molina, J., & Davila, J. (2003). Trust management through fuzzy reputation. International Journal of
Cooperative Information Systems, 12(1), 135155.

Buning, M. D. C., Hondius, E., Prins, C., & Vries, M.


D. (2001). Consumer@Protection.EU. An analysis of

Carr, C. A., Erickson, G. S., & Rothberg, H. N. (2004),


Intellectual capital, competitive intelligence and the eco-



nomic espionage act. International Journal of Learning


and Intellectual Capital, 1(4), 460-482.
Carr, D. (2006, July 6). How Google works. Retrieved
November 17, 2006, from http://www.baselinemag.
com/article2/0,1397,1985040,00.asp
Cate, F. H., Fields, D. A., & McBain, J. K. (1994). The
right to privacy and the publics right to know: The
central purpose of the freedom of information act.
Administrative Law Review, 46(Winter), 41-74.
Caudill, E. M., & Murphy, P. E. (2000). Consumer online
privacy: Legal and ethical issues. Journal of Public Policy
& Marketing, 19(1), 7-19.
CDT. (2005). Center for democracy and technology.
Legislation Center. Retrieved March 25, 2005, from
http://cdt.org/legislation/
Centeno, C. (2002). Building security and consumer trust
in internet payments: The potential of soft measure.
Seville, Spain: Institute for Prospective Technological
Studies.
CERT. (2007). Vulnerability remediation statistics.
Retrieved November 2007, from http://www.CERT.
org/stats/vulnerability_remediation.html
Chamness, J. N. Liability of physicians for communicating
over the internet. Retrieved June 1, 2007, from http://
www.Cornelius-collins.com/CM/Whats New/asp
Chan, P. (2003, September). The practical effect of privacy
laws on the global business and global consumer. Paper
presented at the 25th International Conference of Data
Protection and Privacy Commissioners, Sydney, NSW.
Chang, J. (2006). Is Facebook private? Northwestern
Chronicle. Retrieved October 2, 2007, from http://www.
chron.org/tools/viewart.php?artid=1346
Charney R., & Greenberg, B. S. (2002). Uses and gratifications of the Internet. In C. Lin & D. Atkin (Eds.),
Communication, technology and society: New media
adoption and uses. Cresskill, NJ: Hampton Press.
Chaum, D. (1981). Untraceable electronic mail, return
addresses, and digital pseudonyms. Communications of
the ACM, 24(2), 84-88.

Chaum, D. (1985). Security without identification:


Transaction systems to make big brother obsolete. Communications of the ACM, 28(10), 1030-1044.
Chaum, D. (1988). The dining cryptographers problem:
unconditional sender and recipient untraceability. Journal
of Cryptology, 1(1), 65-75.
Chaum, D., Fiat, A., & Naor, M. (1990). Untraceable
electronic cash. In S. Goldwasser (Ed.), Advances in Cryptology CRYPTO88 (pp. 319-327). Springer Verlag.
Chen, K., & Rea, A. (2004). Protecting personal information online: A survey of user privacy concerns and control techniques. Journal of Computer Information Systems, 44(4), 85-92.
Chen, Y., & Barnes, S. (2007). Initial trust and online
buyer behaviour. Industrial Management & Data Systems, 107(1), 21-36.
Cheng, T. S. L. (2004). Spam regulation: Recent international attempts to can spam. Computer Law & Security
Report, 20(6), 472-479.
Chesley, N. (2006). Families in a high-tech age: Technology usage patterns, work and family correlates, and
gender. Journal of Family Issues, 27(5), 587-608.
Chester, J. (2006, March 26). Google's wi-fi privacy ploy.
Retrieved November 14, 2006, from www.thenation.
com/doc/20060410/chester
Chiang, M., & Starren, J. (2002). Telemedicine and HIPAA: Data confidentiality and HIPAA. Retrieved May 23, 2007, from http://www.ideatel.org/syllabus/hipaa.html

Children's Online Privacy Protection Act. (2000). Enacted April 22, 2000. Retrieved from http://www.epic.org/privacy/kids/

Clark, R. (1994). Human identification in information systems: Management challenges and public policy issues. Information Technology and People, 7(4), 6-37.

Cho, H., & LaRose, R. (1999). Privacy issues in Internet surveys. Social Science Computer Review, 17(4),
421-434.

Clarke, I., Sandberg, O., Wiley, B., & Hong, T. W. (2000). Freenet: A distributed anonymous information storage and retrieval system. In H. Federrath (Ed.), Workshop on Design Issues in Anonymity and Unobservability (pp. 46-66). Springer Verlag.

Choice. (2006). The eight basic consumer rights. Retrieved November 5, 2006, from http://www.choice.com.
au/viewArticle.aspx?id=100736&catId=100528&tid=100
008&p=1&title=The+eight+basic+consumer+rights
Chor, B., Fiat, A., Naor, M., & Pinkas, B. (2000). Tracing traitors. IEEE Transactions on Information Theory,
44(3), 893-910.
Chou, C., & Hsiao, M. C. (2000). Internet addiction, usage, gratification, and pleasure experience: The Taiwan college students' case. Computers & Education, 35, 65-80.
Christ, R. E., Berges, J. S., & Trevino, S. C. (2007). Social
networking sites: To monitor or not to monitor users and
their content? Intellectual Property & Technology Law
Journal, 19(7), 13-17.
Chua, S. L., Chen, D.T., & Wong, A. F. L. (1999). Computer
anxiety and its correlates: A meta-analysis. Computers
in Human Behaviors, 15, 609-623.
Cincu, J., & Richardson, R. (2006). Virus attacks named
leading culprit of financial loss by U.S. companies in
2006 CSI/FBI computer crime and security survey.
Retrieved July 13, 2006, from http://www.gocsi.com/
press/20060712.jhtml
Claburn, T. (2004). Dell believes education is best way to
fight spyware. InformationWeek, October 20. Retrieved
September 30, from http://www.informationweek.com/
showArticle.jhtml;jsessionid=GHVMAU4IX1LXGQS
NDLOSKHSCJUNN2JVN?articleID=50900097&que
ryText=Dell+Believes+Education+Is+Best+Way+To+F
ight+Spyware
Clark, R. (1989). The Australian privacy act 1988 as an implementation of the OECD data protection guidelines. Retrieved March 27, 2007, from http://www.anu.edu.au/people/Roger.Clarke/DV/PActOECD.html

Clarke, R. (1998). Direct marketing and privacy. Retrieved March 24, 2007, from http://www.anu.edu.
au/people/Roger.Clarke/DV/DirectMkting.html
Clarke, R. (1999). Internet privacy concerns confirm
the case for intervention. Communications of the ACM,
42(2), 60-67.
Class FPR: Privacy. (1999). Common criteria for
information technology security evaluation. Part 2:
Security functional requirements. Version 2.1. (Report
CCIMB-99-032) (pp.109-118). Ft. Meade, MD: National
Information Assurance Partnership (NIAP). Retrieved
June 5, 2007, from http://www.niap-ccevs.org/cc-scheme/
cc_docs/cc_v21_part2.pdf
Clayton, G. (2000). Privacy evaluation: Dell. Retrieved
July 20, 2006, from http://www.informationweek.com/
privacy/dell.htm
Clearswift. (2006, October). Simplifying content security: Ensuring best-practice e-mail and web use. The
need for advanced, certified email protection. Retrieved
October 2007, from http://whitepapers.zdnet.com/whitepaper.aspx?&scid=280&docid=271750
ClickZ. (2003). Users still resistant to paid content.
Jupitermedia, April 11. Retrieved April 16, 2007, from
http://www.ecommerce-guide.com/news/news/print.
php/2189551


Cline, J. (2003). The ROI of privacy seals. Retrieved October 12, 2007, from http://www.computerworld.com/developmentopics/websitemgmt/story/0,10801,81633,00.html
CNET Staff. (2004, September). Spam volume keeps
rising. Retrieved September 2007, from http://news.com.
com/2114-1032-5339257.html
CNN. (2007). Google privacy Worst on the Web. CNN.
com. Retrieved July 10, 2007, from http://www.cnn.
com/2007/TECH/internet/06/11/google.privacy.ap/
Coalition Against Unsolicited Bulk Email (Australia).
(2002). Spam volume statistics. Retrieved June 2, 2007,
from http://www.caube.org.au/spamstats.html
Cochetti, R. J. (2007, June). Testimony of the computing
technology industry association (CompTIA), before the
house small business committee subcommittee on finance
and tax, data security: Small business perspectives.
Retrieved October 2007, from www.house.gov/SMBiz/
hearings/hearing-06-06-07-sub-data/testimony-06-0607-compTIA.pdf
Cofta, P. (2006). Impact of convergence on trust in ecommerce. BT Technology Journal, 24(2), 214-218.
Cohen, J. (2003). DRM and privacy. Communications
of the ACM, 46(4), 47-49.
Cohoon, J. M., & Aspray, W. (2006). A critical review
of the research on women's
Colgan, N. (2003). Ireland: Electronic commerce directive: Implementation into Irish law. International Trade Law and Regulation, 9(2).
Collberg, C., & Thomborson, C. (2002). Watermarking, tamper-proofing, and obfuscation: Tools for software protection. IEEE Transactions on Software Engineering, 28(8), 735-746.
Computing Technology Industry Association. (2004).
Annual study. Retrieved October 2007, from http://www.
joiningdots.net/library/Research/statistics.html
Congress of the United States. (2001). USA PATRIOT
ACT of 2001. Retrieved July 16, 2007, from http://thomas.
loc.gov/cgi-bin/bdquery/z?d107:H.R.3162

Congress Passes Safe Web Act 2006. (2007). Retrieved January 31, 2007, from http://www.epic.org
Consumer Affairs Victoria. (2003). Commonwealth
website guidelines ignored. Retrieved November
16, 2006, from http://www.consumer.vic.gov.au/
CA256F2B00231FE5/Print/C3DCDCFFC3DBD8EEC
A256F54000412C4?OpenDocument
Consumer Affairs Victoria. (2004). Online shopping
and consumer protection. Discussion paper, Melbourne,
Victoria: Standing Committee of Officials of Consumer
Affairs - E-commerce Working Party, Consumer Affairs Victoria.
Consumer fraud and identity theft complaint data, January-December 2005. (2006). Retrieved June 1, 2007, from
http://www.ftc.gov/bcp/edu/microsites/idtheft/downloads/clearinghouse_2005.pdf
Consumer Protection Commission, E. Y. T. (undated). Ecommerce: APEC voluntary online consumer protection
guidelines. Retrieved April 3, 2007, from http://www.
cpc.gov.tw/en/index.asp?pagenumber=25
Consumers International. (2001). Should I buy? Shopping
online 2001: An international comparative study of electronic commerce. London: Consumers International.
Consumers International. (2004). Annual report 2004.
London: Consumers International.
Consumers International. (2006). World consumer
rights day. Retrieved November 7, 2006, from http://
www.consumersinternational.org/Templates/Internal.
asp?NodeID=95043&int1stParentNodeID=89651&int
2ndParentNodeID=90145
Controlling the Assault of Non-Solicited Pornography
and Marketing Act of 2003, 15 U.S.C. 7706.
Coonan, H. (2005, February). 10 years on. 10 years
strong. The internet in Australia. Paper presented at
the 2005 Internet Industry Association Annual Dinner,
Sydney, NSW.
Cooper, K. B., & Victory, N. J. (2002, February). A nation online: How Americans are expanding their use
of the internet. Washington, DC: U.S. Department of
Commerce.

COPPA FAQs. (n.d.). Retrieved from http://www.ftc.gov/privacy/coppafaqs.htm

COPPA protects children but challenges lie ahead. (2007). Retrieved from http://www.ftc.gov/opa/2007/02/copparpt.htm

Culnan, M. (1999). Privacy and the top 100 web sites: Report to the federal trade commission. Retrieved December 10, 2000, from http://www.msb.edu/faculty/culnanm/gippshome.html

Corbett, R. (1993). The treaty of Maastricht. London: Longman.
Cover, T., & Thomas, J. (1991). Elements of information
theory. Hoboken, NJ: John Wiley & Sons.
Cox, I., Miller, M., & Bloom, J. (2002). Digital watermarking: Principles & practice. San Diego, CA:
Academic Press.
Cox, J. (2007, February 9). RSA: attendees drop ball on
wi-fi security - many IT security experts at conference
used unsecured devices. Network World. Retrieved
October 2007, from http://www.networkworld.com/
news/2007/020907-rsa-wifi-security.html
Craig, P., & de Búrca, G. (2007). EU law: Text, cases & materials (4th ed.). Oxford, UK: Oxford University Press.
Cranor, L. F. (2001). Introduction to P3P. In L. F. Cranor
(Ed.), Web privacy with P3P. O'Reilly.
Cranor, L. F. (2003). P3P: making privacy policies more
useful. IEEE Security and Privacy, 1(6), 50-55.

Culnan, M. J. (1995). Consumer awareness of name
removal procedures: Implications for direct marketing.
Journal of Direct Marketing, 7, 10-19.
Culnan, M. J. (1999). The Georgetown Internet privacy
policy survey: Report to the Federal Trade Commission. Retrieved April 15, 2005, from http://www.msb.
edu/faculty/culnanm/gipps/gipps1.pdf
Culnan, M. J. (2000). Protecting privacy online: Is
self-regulation working? Journal of Public Policy &
Marketing, 19(1), 20-26.
Culnan, M., & Armstrong, P. (1999). Information privacy concerns, procedural fairness, and impersonal trust: An empirical investigation. Organization Science, 10(1), 104-115.

Cranor, L. F., Reagle, J., & Ackerman, M. S. (1999). Beyond concern: Understanding net users' attitudes about
online privacy (Tech. Rep. No. TR 99.4.3). Middletown,
NJ: AT&T Labs-Research. Retrieved June 5, 2007, from
citeseer.ist.psu.edu/cranor99beyond.html

Curtis, K. (2005, September). The importance of self-regulation in the implementation of data protection
principles: the Australian private sector experience.
Paper presented at the 27th International Conference of
Data Protection and Privacy Commissioners, Montreux,
Switzerland.

Cravens, A. (2004), Speeding ticket: The U.S. residential broadband market by segment and technology.
In-Stat/MDR.

Curtis, K. (2005, March). Privacy in practice. Paper presented at the Centre for Continuing Legal Education, University of NSW, Sydney.

Crime and Misconduct Commission Queensland. (2004). Cyber traps: An overview of crime, misconduct and security risks in the cyber environment. Queensland: Crime and Misconduct Commission.

Cyberdocs today announced the first virtual doctors [sic] office on the world wide web. (1996). Westlaw M2 Presswire. Retrieved June 1, 2007, from 1996 WL 11276562

Culnan, M. (1999). Georgetown Internet privacy policy survey: Report to the federal trade commission. Retrieved December 10, 2000, from http://www.msb.edu/faculty/culnanm/gippshome.html

Cyota. (2005). Cyota online fraud survey. Retrieved April 7, 2006, from http://www.cyota.com/press-releases.asp?id=78


D'Astous, A. (2000). Irritating aspects of the shopping environment. Journal of Business Research, 49, 149-156.
Dalal, R. S. (2006). Chipping away at the constitution:
The increasing use of RFID chips could lead to an erosion of privacy rights. Boston University Law Review,
86(April), 485.
Daub, T. R. (2001). Surfing the net safely and smoothly:
A new standard for protecting personal information
from harmful and discriminatory waves. Washington
University Law Quarterly, 79, 913-949.
Davies, S. (2004, February). The loose cannon: An
overview of campaigns of opposition to national identity
card proposals. Paper presented at the Unisys seminar:
e-ID: Securing the mobility of citizens and commerce
in a Greater Europe, Nice.
Day, J. (1999, February 22). Report: Teen online spending increases. Retrieved July 12, 2007, from http://www.
ecommercetimes.com/perl/story/366.html
Dell Inc. Australia. (2004). Dell's privacy policy. Retrieved March 2, 2006, from http://www1.
ap.dell.com/content/topics/topic.aspx/ap/policy/en/
privacy?c=au&l=en&s=gen
Dell Inc. Australia. (2005). Dell's online policies. Retrieved March 28, 2006, from http://www1.
ap.dell.com/content/topics/topic.aspx/ap/policy/en/au/
termsau?c=au&l=en&s=gen
Dell Inc. Australia. (2007). Online communication
policy. Retrieved June 5, 2007, from http://www.dell.
com/content/topics/global.aspx/corp/governance/en/online_comm?c=us&l=en&s=corp
Denoon, D. J. (2007). Internet drug pushing up again.
More websites advertising, selling controlled prescription drugs. Health News. Retrieved June 21, 2007, from
WebMD, http:// www.webmd.com/news/200770518/interent-drug-pushing-up-again
Department of Economic and Social Affairs (UN). (2003).
United nations guidelines for consumer protection (as
expanded in 1999). New York: United Nations.

Desai, M. S., Richards, T. C., & Desai, K. J. (2003). E-commerce policies and customer privacy. Information Management & Computer Security, 11(1), 19-27.
Diaz, C., Seys, S., Claessens, J., & Preneel, B. (2003,
April). Towards measuring anonymity. In R. Dingledine
& P. F. Syverson, (Eds.), Proceedings of the 2nd International Workshop on Privacy Enhancing Technologies PET 2002, San Francisco, CA. Lecture Notes in
Computer Science (Vol. 2482, pp. 184-188). Heidelberg,
Germany: Springer.
Dietrich, W. (2006). Are journalists the 21st century's
buggy whip makers? Nieman Reports, 60(4), 31-33.
Dinev, T., & Hart, P. (2006). An extended privacy calculus model for e-commerce transactions. Information
Systems Research, 17(1), 61-80.
Dinev, T., & Hu, Q. (2007). The centrality of awareness
in the formation of user behavioral intention toward
protective information technologies. Journal of the Association for Information Systems, 8(7), 386-408.
Dinev, T., Bellotto, M., Hart, P., Russo, V., Serra, I., & Coluatti, C. (2006). Internet users' privacy concerns and
beliefs about government surveillance: An exploratory
study of differences between Italy and the United
States. Journal of Global Information Management,
14(4), 57-93.
Dingledine, R., Freedman, M. J., & Molnar, D. (2000).
The free haven project: Distributed anonymous storage
service. In H. Federrath (Ed.), Workshop on Design
Issues in Anonymity and Unobservability (pp. 67-95).
Springer Verlag.
Dobosz, B., Green, K., & Sisler, G. (2006). Behavioral
marketing: Security and privacy issues. Journal of Information Privacy & Security, 2(4), 45-59.
Doe v. Roe, 93 Misc. 2d 201, 400 N.Y.S.2d 668 (Sup.
Ct. 1977).
Dominick, J. (1999). Who do you think you are? Personal
home pages and self-presentation on the world wide
web. Journalism and Mass Communication Quarterly,
76, 646-658.


Donahey, M. S. (2003). The UDRP model applied to online consumer transactions. Journal of International Arbitration, 20(5), 475-491.
Donnellan, T. (1968). Lattice theory. Oxford, NY: Pergamon Press.
Döring, N. (2002). Personal home pages on the web: A review of research. Journal of Computer Mediated Communication, 7(3). Retrieved December 26, 2003, from http://jcmc.indiana.edu/vol7/issue3/doering.html
Down, M., & Sands, R. (2004). Biometrics: An overview
of the technology, challenges and control considerations.
Information Systems Control Journal, 4, 53.
Drennan, J., Mort, G. S., & Previte, J. (2006). Privacy, risk
perception, and expert online behavior: An exploratory
study of household end users. Journal of Organizational
and End User Computing, 18(1), 1-22.
Drugs and Crime Prevention Committee (Parliament
of Victoria). (2004). Inquiry into fraud and electronic
commerce: Final report. Melbourne, Victoria: Parliament of Victoria.
Duffy, D. (2005). The evolution of customer loyalty
strategy. The Journal of Consumer Marketing, 22(4/5),
284-286.
Eastlick, M., Lotz, S., & Warrington, P. (2006). Understanding online B-to-C relationships: An integrated
model of privacy concerns, trust, and commitment.
Journal of Business Research, 59(8), 877-886.

Electronic Frontier Foundation. (2007). Privacy section of
their web site. Retrieved July 12, 2007, from http://www.
eff.org/Privacy
Electronic Privacy Information Center (EPIC). (1997).
Surfer beware: Personal privacy and the internet. Washington, D.C.: Electronic Privacy Information Center.
Retrieved July 7, 2007, from http://www.epic.org/reports/surfer-beware.html
Electronic Privacy Information Center (EPIC). (2007).
Proposed Google/DoubleClick deal. Washington, D.C.:
Electronic Privacy Information Center. Retrieved July 7,
2007, from http://www.epic.org/privacy/ftc/google/
Electronic Privacy Information Center (EPIC). (2007). Privacy web site. Retrieved July 12, 2007, from http://epic.org
Ellis, H. (2004). Modern Irish commercial and consumer
law. London: Jordan Publishing.
Email Security and Anonymity. (2004). Retrieved from
http://www.anonic.org/email-security.html
EMI. (2007, April 2). EMI music launches DRM-free
superior sound quality downloads across its entire digital
repertoire. EMI Group, Press Release.

Eden, J. M. (2005). When big brother privatizes: Commercial surveillance, the privacy act of 1974 and the future
of RFID. Duke Law & Technology Review, 20.

Endpointsecurity. (2004). What is endpoint security? Retrieved October 2007, from http://www.endpointsecurity.
org/Documents/What_is_endpointsecurity.pdf

EDRI. (2004). EU report: Member states lazy to protect data. Retrieved August 20, 2007, from http://www.edri.org/edrigram/number2.24/report

English Parliament. (2006). Terrorism Act 2006. Queen's Printer of Acts of Parliament, UK.

EFF. (2005). Sony BMG litigation info. Electronic Frontier Foundation. Retrieved April 16, 2007, from http://www.eff.org/IP/DRM/Sony-BMG/
Egger, F. N. (2002). Consumer trust in e-commerce: From psychology to interaction design. In J. E. J. Prins, P. M. A. Ribbers, H. C. A. van Tilborg, A. F. L. Veth, & J. G. L. van der Wees (Eds.), Trust in electronic commerce: The role of trust from a legal, an organizational and a technical point of view (pp. 11-43). The Hague/London/New York: Kluwer Law International.

Englund, S., & Firestone, R. (2004). California law regulates web site privacy policies. The Computer & Internet Lawyer, 21(8), 22-23.
EPIC. (2002, June 5). Letter to house judiciary subcommittee on the courts, the internet, and intellectual property. The Electronic Privacy Information Center and the Electronic Frontier Foundation. Retrieved April 16, 2007, from http://www.epic.org/privacy/drm/hjdrmltr6.5.02.html
Estes, A. (2007, May 6). Bill would ban lenders' alerts
to homebuyers. Boston Globe, p. B3.
European Commission. (2005). Consumer protection
in the European Union: Ten basic principles. Brussels:
European Commission.
European Opinion Research Group. (2003). Data Protection, Special Euro barometer 196, Wave 60.0.
European Parliament. (1995). Directive 95/46/EC of the
European Parliament and of the Council of 24 October
1995 on the protection of individuals with regard to the
processing of personal data and on the free movement
of such data.
European Parliament. (2001). Report on the existence
of a global system for the interception of private and
commercial communications (ECHELON interception
system) (2001/2098(INI)).
European Union and United States of America. (2007).
Agreement between the European Union and the United
States of America on the processing and transfer of passenger name record (PNR) data by air carriers to the
United States Department of Homeland Security.
European Union. (2006). Directive 2006/24/EC of the
European Parliament and of the Council of 15 March 2006
on the retention of data generated or processed in connection with the provision of publicly available electronic
communications services or of public communications
networks and amending Directive 2002/58/EC.
Evans, M., & McKenna, B. (2000). Dragnet targets
Internet vandals. The Globe and Mail, February 10,
A1, A10.
Expert Group on Electronic Commerce (Australia).
(2003). Review of building consumer sovereignty in
electronic commerce: A best practice model for business.
Canberra, ACT: Treasury (Australia).
Facebook. (2007). Retrieved December 9, 2007, from
http://www.facebook.com/

Farrell, S., & Housley, R. (2002). RFC3281: An internet attribute certificate profile for authorization. The Internet Society, Network Working Group. Retrieved June 5, 2007, from http://www.ietf.org/rfc/rfc3281.txt
Federal Bureau of Consumer Affairs (Australia). (1993).
Your consumer rights. In K. Healey (Ed.), Consumer
rights (Vol. 38). Balmain NSW: The Spinney Press.
Federal Bureau of Consumer Affairs (Australia). (1995).
Your consumer rights. In K. Healey (Ed.), Consumer
rights (Vol. 38). Balmain NSW: The Spinney Press.
Federal Privacy Act of 1974. (n.d.). Retrieved from
http://www.usdoj.gov/foia/privstat.htm
Federal Security Information Act of 2002. (2002). Retrieved from http://en.wikipedia.org/wiki/Federal_Information_Security_Management_Act_of_2002
Federal Trade Commission. (1996). Consumer information privacy hearings.
Federal Trade Commission. (1998). Privacy online: A
report to congress. Retrieved July 14, 2007, from http://
www.ftc.gov/reports/privacy3/index.shtm
Federal Trade Commission. (1998, June). Privacy online:
A report to congress. Retrieved January 2, 2001, from
http://www.ftc.gov/reports/privacy3
Federal Trade Commission. (1999). Children's Online
Privacy Protection Rule, Federal Register, 64, 212.
(November 3), 59888-59915.
Federal Trade Commission. (1999, July). Self-regulation
and privacy online: A federal trade commission report to
congress. Retrieved January 2, 2001, from http://www.
ftc.gov/os/1999/9907/index.htm#13
Federal Trade Commission. (2000, May). Privacy online:
Fair information practices in the electronic marketplace:
A federal trade commission report to congress. Retrieved
January 2, 2001, from http://www.ftc.gov/os/2000/05/
index.htm#22
Ferguson, D. A., & Perse, E.M. (2000). The world wide
web as a functional alternative to television. Journal of
Broadcasting & Electronic Media, 44, 155-174.


Ferraiolo, D., & Kuhn, R. (1992). Role-based access controls. In Proceedings of the 15th NIST-NCSC National Computer Security Conference (pp. 554-563).
Festa, P. (2002). Windows media aware of DVDs watched.
CNet News. Retrieved May 28, 2007, from http://news.
com.com/2100-1023-841766.html
Fielder, A. (2002). Better compliance: guidance, enforcement & self-regulation. Paper presented at the Data
Protection Conference and Report on the implementation
of Directive 95/46/EC, 2002. Retrieved October 13, 2007,
from http://ec.europa.eu/justice_home/fsj/privacy/docs/
lawreport/fielder_en.pdf
Fielding, J. (2007, January 28). 25% of all computers
on Botnets. Retrieved http://blogs.techrepublic.com.
com/networking/?cat=2
Financial News Online U.S. (2007). JPMorgan client data
loss. Story attributed to the Wall Street Journal, reported
on Financial News Online U.S. on May 1, 2007.
Fischer, K. (2007). Apple hides account info in DRM-free
music, too. Ars Technica. Retrieved July 8, 2007, from
http://arstechnica.com/news.ars/post/20070530-applehides-account-info-in-drm-free-music-too.html
Fischer-Hübner, S. (2001). IT-security and privacy: Design and use of privacy-enhancing security mechanisms. Lecture Notes in Computer Science, 1958. Heidelberg,
Germany: Springer.
Fischer-Hübner, S. (2003). Privacy enhancing technologies. Session 1 and 2, Ph.D. course, Winter/Spring 2003.
Karlstad, Sweden: Karlstad University. Retrieved June
5, 2007, from http://www.cs.kau.se/~simone/kau-phdcourse.htm
Fitzgerald, M., & Corvin, A. (2001). Diagnosis and differential diagnosis of Asperger syndrome. Advances in
Psychiatric Treatment, 7(4), 310-318.
Flavián, C., & Guinalíu, M. (2006). Consumer trust,
perceived security and privacy policy: Three basic elements of loyalty to a web site. Industrial Management
& Data Systems, 106(5), 601-620.


Flinn, S., & Lumsden, J. (2005). User perceptions of privacy and security on the web. Retrieved October 12, 2007, from http://www.lib.unb.ca/Texts/PST/2005/pdf/flinn.pdf
Flynn, N. (2005). E-policy best practices: A business guide to compliant & secure internet, instant messaging (IM), peer-to-peer (P2P) and email communications. The ePolicy Institute; Executive Director, St. Bernard Software. Retrieved from http://www.securitytechnet.com/resource/security/application/iPrism_ePolicy_Handbook.pdf
Food and Drug Administration, FAQs. Retrieved
September 26, 2007, from http://www.fda.gov/oc/buyonline/faqs.html
Forder, J. (1999). The IIA code of practice: Co-regulation of the internet starts here. Retrieved March 31, 2007,
from http://epublications.bond.edu.au/law pubs/38
Forescout. (2007). NAC enforcement and the role of the
client. Infonetics Research, Inc. Retrieved July 2007, from
www.Forescout.com/downloads/whitepapers/Infonetics-NAC-Enforcement-and-the-Role-of-the-Client.pdf
Fost, D. (2007). The technology chronicles: the attack on Kathy Sierra. Retrieved March 27, 2007,
from http://www.sfgate.com/cgi-bin/blogs/sfgate/
detail?blogid=19&century_id=14783
Fox v. Smith, 594 So. 2d 596 (Miss. 1992). Retrieved June
1, 2007, from LexisNexis Cases, Law School database.
Fram, E. H., & Grady, D. B. (1997). Internet shoppers:
Is there a surfer gender gap? Direct Marketing, January, 46-50.
Fraser, A. (1999, August 8). Chronicle of Higher Education, 48, p. B8.
Fraser, S., & Henry, L. (2007). An exploratory study of
residential internet shopping in Barbados. Journal of
Eastern Caribbean Studies, 32(1), 1-20, 93.
Fraud costing Australia $1.1b a Year. (2006, April 7).
The Age.


Fraud.org. (n.d.). Retrieved from http://www.fraud.org/tips/internet/phishing.htm, http://www.phishinginfo.org/
Friedman, B., Kahn, P., Hagman, J., Severson, R., & Gill,
B. (2006). The watcher and the watched: Social judgments about privacy in a public place. Human-Computer
Interaction, 21(2), 235-272.
FTC File No. 032 3221. (2004). Petco settles FTC charges.
Retrieved from http://www.ftc.gov/opa/2004/11/petco.
htm
FTC File No. 062-3073. (2006). Xanga.com to pay $1
million for violating children's online privacy protection
rule. Retrieved from http://www.ftc.gov/opa/2006/09/
xanga.htm
FTC v. BJ's Wholesale Club, Inc. (2005). Filing ordered
May 17, 2005. Retrieved from http://www.ftc.gov/os/
caselist/0423160/050616agree0423160.pdf
FTC. (1998). Privacy online: A report to congress.
Retrieved March 25, 2005, from http://www.ftc.gov/reports/privacy3/priv-23a.pdf
FTC. (2000). Privacy online: Fair information practices
in the electronic marketplace. Washington, DC: Federal
Trade Commission.
FTC. (2005). Federal trade commission privacy initiatives. Retrieved March 25, 2005, from http://www.ftc.
gov/privacy/index.html
FTC.gov Spyware. (2007). Retrieved from http://www.
ftc.gov/bcp/conline/pubs/alerts/spywarealrt.htm
FTC.gov. (2006). Retrieved from http://www.ftc.gov/bcp/
edu/microsites/idtheft/consumers/about-identity-theft.
html#Whatisidentitytheft
Fujimura, K., & Nishihara, T. (2003). Reputation rating system based on past behavior of evaluators. In
Proceedings of the 4th ACM Conference on Electronic
Commerce, San Diego, CA, USA (pp. 246-247). New
York: ACM Press.
Furnell, S. (2002). Cybercrime: Vandalizing the information society. Boston, MA: Addison-Wesley.

Gage, D. (2005, March 7). Shadowcrew: Web mobs - timeline: Cybercrime. Retrieved November 1, 2006, from http://www.baselinemag.com/article2/0,1397,1774786,00.asp
Garbarino, E., & Strahilevitz, M. (2004). Gender differences in the perceived risk of buying online and the
effects of receiving a site recommendation. Journal of
Business Research, 57(7), 768-775.
Garfinkel, S. (2002). Web security, privacy & commerce
(2nd ed.). Sebastopol, CA: O'Reilly & Associates.
Garfinkel, T., Pfaff, B., Chow, J., Rosenblum, M., &
Boneh, D. (2003). Terra: A virtual machine-based
platform for trusted computing. In Proceedings of 19th
ACM Symposium on Operating Systems Principles,
SOSP 2003, Bolton Landing, New York (pp. 193-206).
New York: ACM Press. Retrieved June 5, 2007, from
http://citeseer.ist.psu.edu/667929.html
Gefen, D., & Straub, D. W. (1997). Gender differences in the perception and use of e-mail: An extension to the technology acceptance model. MIS Quarterly, 21(4), 389-400.
GFI. (2007). The threats posed by portable storage
devices. Whitepaper. Retrieved July 2007, from http://
www.gfi.com/whitepapers/threat-posed-by-portablestorage-devices.pdf
Gilboa, W. (1996). Elites, lamers, narcs, and whores:
Exploring the computer underground. In L. Cherny & E.
R. Weise (Eds.), Wired women: Gender and new realities
in cyberspace (pp. 98-113). Seattle, WA: Seal Press.
Gilliams, H. (2003, October). Self regulation by liberal
professions and the competition rules. Paper presented
at the Regulation of Professional Services Conference
organized by the European Commission, Brussels.
Gillies, L. (2001). A review of the new jurisdiction rules
for electronic consumer contracts within the European
Union. Journal of Information Law and Technology
(1).
Glaessner, T. C., Kellermann, T., & McNevin, V. (2004). Electronic safety and soundness: Securing finance in a new age (World Bank Working Paper No. 26). Washington, DC. Retrieved from http://siteresources.worldbank.org/DEC/Resources/abstracts_current_studies_2004.pdf
Godwin, M. (2006). Digital rights management: A guide
for librarians. Office for Information Technology Policy,
American Library Association.
Goldberg, I. (2000). A pseudonymous communications infrastructure for the internet. Ph.D. thesis, University of California at Berkeley. Retrieved June 5, 2007, from http://www.isaac.cs.berkeley.edu/ iang/thesis-final.pdf
Goldberg, I. (2003). Privacy-enhancing technologies for the Internet, II: Five years later. In G. Goos, J. Hartmanis, & J. van Leeuwen (Eds.), Second International Workshop on Privacy Enhancing Technologies (PET 2002) (pp. 209-213). Springer Verlag.
Goldberg, I., Wagner, D., & Brewer, E. (1997). Privacyenhancing technologies for the Internet. In Proceedings
of IEEE COMPCON '97 (pp. 103-110). IEEE Computer
Society Press.
Goldfarb v. Virginia State Bar, 421 U.S. 773, 792
(1975).
Gomez-Velez, N. (2005). Internet access to court records: Balancing public access and privacy. Loyola Law Review, 51, 365-438.
Goodwin, C. (1991). Privacy: Recognition of a consumer
right. Journal of Public Policy & Marketing, 10(1),
149-166.
Google corporate information: Google milestones. (n.d.).
Retrieved November 25, 2006, from http://www.google.
com/corporate/history.html
Google privacy center: Privacy policy. (n.d.). Retrieved
November 10, 2006, from http://www.google.com/privacy.html


Google. (2007). Retrieved December 9, 2007, from http://www.google.com/
Google. (2007). Retrieved December 9, 2007, from http://toolbar.google.com/
Google. (2007c). Retrieved December 9, 2007, from http://documents.google.com/
Google's advertising footprint. (2007, June 14). Retrieved
July 21, 2007, from http://www.eweek.com/slideshow/
0,1206,pg=0&s=26782&a=209549,00.asp
Gordon, L. A., Loeb M. P., Lucyshyn, W., & Richardson,
R. (2006). CSI/FBI computer crime and security survey.
Computer Security Institute. Retrieved November 2007,
from http://www.cse.msu.edu/~cse429/readings06/
FBI2006.pdf
Gordon, L. A., Loeb, M. P., Lucyshyn, W., & Richardson,
R. (2007). 2006 CSI/FBI computer crime survey. Retrieved March 3, 2007, from http://www.GoCSI.com
Goto, T. (2006, March). B to B, C to C, C to B, C to C.
Retrieved from Next Generation Electronic Commerce
Promotion Council of Japan: http://www.ecom.jp/journal/2007/ECOM_Journal_2007.pdf
Goto, T. (2007, March). B to B, C to C, C to B, C to C.
Retrieved from Next Generation Electronic Commerce
Promotion Council of Japan: http://www.ecom.jp/journal/2006/ECOMJournal2006.pdf
Grabner-Kraeuter, S. (2002). The role of consumers'
trust in online-shopping. Journal of Business Ethics,
39(1-2), 43-50.
Grabosky, P., Smith, R. G., & Dempsey, G. (2001). Electronic theft: Unlawful acquisition in cyberspace. Cambridge and New York: Cambridge University Press.
Graeme, S. (2005). 30 years of protecting consumers
and promoting competition. Keeping Good Companies,
57(1), 38-41.
Grami, A., & Schell, B. (2004). Future trends in mobile
commerce: Service offerings, technological advances,
and security challenges. Retrieved October 13, 2004,
from http://dev.hil.unb.ca/Texts/PST/pdf/grami.pdf


Granade, P. F., & Sanders, J. H. (1996). Implementing telemedicine nationwide: Analyzing the legal issues. 63 Defense Counsel, 67.
Granfield, A. (2000). FTC's bite out of internet fraud
lacks teeth. Forbes.com. Retrieved July 2, 2007, from
http://www.forbes.com/2000/03/27/mu2.html
Green, A. M. (2003). International privacy laws. Sensitive information in a wired world (Tech. Rep. No. CS
457). New Haven, CT: Yale University.
Greenleaf, G. (2000). Private sector bill amendments
ignore EU problems. Retrieved October 2007, from http://
www.austlii.edu.au/au/journals/PLPR/2000/30.html
Greenleaf, G. (2000). Safe harbor's low benchmark for
adequacy: EU sells out privacy for U.S.$. Retrieved
October 2, 2007, from www.austlii.edu.au/au/journals/
PLPR/2000/32.html
Gregg, J. (2007). Senator Gregg introduces bill to increase safety for Americans buying on-line prescription
drugs. States News Service. Retrieved June 5, 2007, from
LexisNexis News library.
Gumpert, G., & Drucker, S. (1998). The demise of privacy
in a private world: From front porches to chat rooms.
Communication Theory, 8(4), 408-425.
Gurung, A. (2006). Empirical investigation of the relationship of privacy, security and trust with behavioral
intention to transact in e-commerce. Unpublished Dissertation, University of Texas at Arlington, Arlington.
Ha, H. (2005, October). Consumer protection in business-to-consumer e-commerce in Victoria, Australia.
Paper presented at the CACS 2005 Oceania Conference,
Perth, WA.
Ha, H. (2006, September-October). Security issues
and consumer protection in business-to-consumer
e-commerce in Australia. Paper presented at the 2nd
Australasian Business and Behavioural Sciences Association International Conference: Industry, Market,
and Regions, Adelaide.
Ha, H. (2007). Governance to address consumer protection in e-retailing. Unpublished doctoral thesis, Monash
University, Melbourne, Victoria.

Ha, Y., & Stoel, L. (2004). Internet apparel shopping behaviors: The influence of general innovativeness. International Journal of Retail & Distribution Management, 32(8/9), 377-385.
Hair, J. F., Jr., Anderson, R. E., Tatham, R. L., & Black, W. C. (1998). Multivariate data analysis (5th ed.). Upper Saddle River, NJ: Prentice Hall.
Hall, S. (1980). Encoding/decoding. In S. Hall, D.
Hobson, A. Lowe, & P. Willis (Eds.), Culture, media,
language: Working papers in cultural studies. London:
Hutchinson.
Halstuk, M. E. (2005). When is an invasion of privacy
unwarranted under FOIA? An analysis of the supreme
court's sufficient reason and presumption of legitimacy standards. University of Florida Journal of Law
& Public Policy, 16(3), 361-400.
Halstuk, M. E., & Davis, C. N. (2002). The public interest
be damned: Lower court treatment of the reporters committee central purpose reformulation. Administrative
Law Review, 54(3), 983-1024.
Han, P., & Maclaurin, A. (2002). Do consumers really
care about online privacy? Marketing Management,
11(1), 35-38.
Hanson, K. (2006). Should the boss be blogging? Strategic
Communication Management, 10(2), 6-7.
Harland, D. (1987). The United Nations guidelines for
consumer protection. Journal of Consumer Policy,
10(2), 245-266.
Harland, D. (1999). The consumer in the globalised
information society: The impact of the international
organisations. Australian Competition and Consumer
Law Journal, 7(1999), 23.
Harley, D., Slade, R., & Gattiker, U. (2001). Viruses revealed: Understanding and counter malicious software.
New York: McGraw-Hill/Osborne.
Harrington, K. (1999). Legal implications of the practice of medicine over the internet, telemedicine and
cybermedicine. Cyberlaw. Retrieved July 2, 2007, from
http://www.gase.com/cyberlaw/toppage11.htm


Harris, A. J., & Yen, D. C. (2002). Biometric authentication: Assuring access to information. Information
Management and Computer Security, 10(1), 12-19.
Harvard Law Review. (2007). Developments in the law
of media. 120(4), 990.
Haymarket Media. (2007). Spam hits record levels in
February. Retrieved March 20, 2007, from http://www.
crn.com.au/story.aspx?CIID=75798&r=rss
Health Insurance Portability and Accountability Act of
1996, 42 U.S.C. 1320(d).
Health Insurance Portability and Accountability Act
of 1996 - HIPAA (Kennedy-Kassebaum Act). (n.d.).
Retrieved from http://aspe.hhs.gov/admnsimp/pl104191.
htm
Hemphill, T. (2001). Identity theft: A cost of business?
Business and Society Review, 106(1), 51-63.
Hemphill, T. (2002). Electronic commerce and consumer privacy: Establishing online trust in the U.S.
digital economy. Business and Society Review, 107(2),
221-239.
Herbert, N. (2006). Conquering spam in concert: Antispam legislative efforts in the Asia Pacific region. Law
Technology, 39(2), 1-12.
Herring, S. C. (1992). Gender and participation in computer-mediated linguistic discourse. Washington, DC:
ERIC Clearinghouse on Languages and Linguistics,
Document no. ED 345552.

Hoffman, D. L., & Novak, T. P. (1996). Marketing in hypermedia computer-mediated environments: conceptual
foundations. Journal of Marketing, 60(3), 50-68.
Hoffman, D. L., Novak, T. P., & Peralta, M. A. (1999, April-June). Information privacy in the marketspace: Implications for the commercial uses of anonymity on the web. The Information Society, 15(2), 129-139.

Hoffman, D. L., Novak, T. P., & Peralta, M. (1999). Building consumer trust online. Communications of the ACM, 42(4), 80-85.
Hofstede, G. (1997). Culture and organizations. New
York: McGraw Hill.
Holmes, S. (2006, May 22). Into the wild blog yonder.
Business Week, pp. 84-86.
Holt, T.J. (2006, June). Gender and hacking. Paper
presented at the CarolinaCon 06 Convention, Raleigh,
NC.

Higgins, K. (2007, November 9). The world's biggest botnets. Retrieved November 2007, from http://www.darkreading.com/document.asp?doc_id=138610

Homeland Security Act, Cyber Security Enhancement Act enacted December 13, 2001. (2002). Retrieved from http://www.govtrack.us/congress/bill.xpd?bill=h1073482

Hildner, L. (2006). Defusing the threat of RFID: Protecting consumer privacy through technology-specific
legislation at the state level. Harvard Civil Rights-Civil
Liberties Law Review, 41(Winter), 133.

Hong, T., McLaughlin, M. L., Pryor, L., Beaudoin, C., & Grabowicz, P. (2005). Internet privacy practices of news media and implications for online journalism. Journalism Studies, 6(2), 15-28.

Hine, C., & Eve, J. (1998). Privacy in the marketplace. Information Society, 14(4), 253-262.

Hornle, J. (2005). Country of origin regulation in crossborder media: one step beyond the freedom to provide
services? International and Comparative Law Quarterly,
54, 89-126.

Hinman, L. (2006, March 16). Why Google matters. Retrieved November 7, 2006, from http://www.signonsandiego.com/uniontrib/20060316/news_lz1e16hinman.html

Horwedel, D. M. (2006). Blogging rights. Diverse Issues


in Higher Education, 23(2), 28-31.
Hsu, C. W. (2002). Online privacy issues: Comparison
between net users' concerns and web sites' privacy
statements. Paper presented to the 52nd Annual Conference of International Communication Association,
Seoul, Korea.
Hsu, C. W. (2006). Privacy concerns, privacy practices and web site categories: Toward a situational paradigm. Online Information Review, 30(5), 569-586.
http://www.computerweekly.com/Articles/Article.
aspx?liArticleID=208948&PrinterFriendly=true
Hu, Q., & Dinev, T. (2005). Is spyware an internet nuisance or public menace? Communications of the ACM,
48(8), 61-66.
Hu, X., Lin, Z., & Zhang, H. (2003). Trust promoting
seals in electronic markets: An exploratory study of
their effectiveness for online sales promotion. Journal
of Promotion Management, 9(1/2), 163-180.
Huffmann, H. (2004). Consumer protection in e-commerce. University of Cape Town, Cape Town.
Hui, K. L., Tan, B., & Goh, C. Y. (2006). Online information disclosure: Motivators and measurements. ACM
Transactions on Internet Technology, 6(4), 415-441.
Hui, K.-L., Teo, H. H., & Lee, S.-Y. T. (2007). The value
of privacy assurance: An exploratory field experiment.
MIS Quarterly, 31(1), 19-33.
Humphers v. First Interstate Bank, 696 P.2d 527 (Or.
1985).
Hupfer, M., & Detlor, B. (2007). Beyond gender differences: Self-concept orientation and relationship-building applications on the internet. Journal of Business
Research, 60(6), 613-619.

IBM Privacy Research Institute. (2007). Armonk, NY: IBM. Retrieved June 5, 2007, from http://www.research.ibm.com/privacy/
IBM Research. (2006). Global security analysis lab: Fact sheet. IBM Research. Retrieved January 16, 2006, from http://domino.research.ibm.com/comm/pr.nsf/pages/rsc.gsal.html
Im, G. P., & Baskerville, R. L. (2005, Fall). A longitudinal
study of information system threat categories: The enduring problem of human error. ACM The DATA BASE for
Advances in Information Systems, 36(4), 68-79.
Imaio, K. (2005). Michinoku bank privacy data lost.
Retrieved from IT pro: http://itpro.nikkeibp.co.jp/free/
NIP/NIPCOLUMN/20050616/162853/
Information Law Branch. (undated). Information paper
on the introduction of the privacy amendment (private
sector) bill 2000. Barton, ACT: Attorney General's Department (Australia).
International Organization for Standardization. (1999).
Common criteria for information technology security
evaluation (ISO IS 15408). Retrieved on July 12, 2007,
from http://www.commoncriteriaportal.org/
International Organization for Standardization. (2006). Information technology - Security techniques - Encryption algorithms - Part 2: Asymmetric ciphers (ISO/IEC 18033-2).
Internet Industry Association (Australia). (2006a). Content code. Retrieved March 31, 2007, from http://www.
iia.net.au/index.php?option=com_content&task=categ
ory&sectionid=3&id=19&Itemid=33
Internet Industry Association (Australian). (2006b).
About the IIA. Retrieved March 24, 2007, from http://
www.iia.net.au/index.php?option=com_content&task=
section&id=7&Itemid=38
Internet Security Glossary. (2007). The internet society.
Retrieved June 5, 2007, from www.faqs.org/rfcs/rfc2828.
html
Internet Spyware Prevention Act of 2007, H.R. 1525,
110th Congress, 1st Session, (2007).


Internet usage world stats - Internet and population. (n.d.). Retrieved November 12, 2006, from http://www.Internetworldstats.com
Investigation result of data leak from Lawson. (2003).
Retrieved from CNET Japan: http://japan.cnet.com/news/
media/story/0,2000056023,20060378,00.htm
Iredale, W., & Swinford, S. (2007, February 25). Web supplies prescription drug addicts. Sunday Times (London).
Retrieved June 3, 2007, from LexisNexis News file.
Jackson, M. (2003). Internet privacy. Telecommunications
Journal of Australia, 53(2), 21-31.
Jain, A. K., Hong, L., & Pankanti, S. (2000). Biometric
identification. Communications of the ACM, 43(2),
90-98.
Jain, A. K., Ross, A., & Prabhakar, S. (2003). Biometric
recognition: Security and privacy concerns. Security &
Privacy Magazine, IEEE, 1(2), 33-42.
Jain, A. K., Ross, A., & Prabhakar, S. (2004). An introduction to biometric recognition. Circuits and systems
for video technology. IEEE Transactions, 14(1), 4-20.
James, M. L., & Murray, B. E. (2003). Computer crime
and compromised commerce (Research Note No. 6). Canberra, ACT: Department of the Parliamentary Library.
Janda, S., Trocchia, P., & Gwinner, K. (2002). Consumer
perceptions of internet retail service quality. International Journal of Service Industry Management, 13(5),
412-431.
Jansen, T. W., & Peen, S. (2007). Privacy i offentlige systemer [Privacy in public systems]. Master's thesis, Informatics and Mathematical Modelling, Technical University of Denmark (in Danish).
Jenkins, H., & Boyd, D. (2006). MySpace and deleting
online predators act (DOPA). MIT Tech Talk. Retrieved
October 12, 2007, from http://web.mit.edu/cms/People/
henry3/MySpace.pdf
Jensen, F.V. (1996). An introduction to Bayesian networks.
London: UCL Press.


Jhu, C. H., & Yung, H. C. (2006, February 17). The hit beauty in online photo album becomes a TV anchor and a celebrity in showbiz. Ettoday. Retrieved March 26, 2006, from http://www.ettoday.com/2006/02/17/10845-1906739.htm
Johnson, A. (June 2006). Digital doctors. National
Conference of State Legislatures Magazine. Retrieved
June 1, 2007, from http://www.ncsl.org/programs/pubs/
slmag/2006/org
Johnson, D., & Post, D. (1996). Law and borders: The rise
of law in cyberspace. 48 Stanford Law Review, 1367.
Johnson, M. L. (2004). Biometrics and the threat to civil
liberties. Computer, 37(4), 90-92.
Johnston, C. A. (1983). Greentree v. united states customs
service: A misinterpretation of the relationship between
FOIA exemption 3 and the privacy act. Boston University
Law Review, 63, 509-531.
Joines, J. L., Scherer, C. W., & Scheufele, D. A. (2003).
Exploring motivations for consumer web use and their
implications for e-commerce. The Journal of Consumer
Marketing, 20(2/3), 90-108.
Jonczy, J., & Haenni, R. (2005). Credential networks:
A general model for distributed trust and authenticity
management. Retrieved on October 10, 2007, from http://
www.lib.unb.ca/Texts/PST/2005/pdf/jonczy.pdf
Jones, M. (1991). Privacy: A significant marketing issue
for the 1990s. Journal of Public Policy & Marketing,
10(spring), 133-148.
Jordan, T., & Taylor, P. (1998). A sociology of hackers.
The Sociological Review, 46(4),757-780.
Juniper Research. (2006, February). Security information
& event management. Retrieved http://www.juniper.
net/solutions/literature/solutionbriefs/351178.pdf
Kai, H. (2004). Comment from President of Softbank
Co. Retrieved from Internet Watch: http://internet.watch.
impress.co.jp/cda/news/2004/02/27/2257.html
Kang, H., & Yang, H. (2006). The visual characteristics of avatars in computer-mediated communication: Comparison of internet relay chat and instant messenger as of 2003. International Journal of Human-Computer Studies, 64(12), 1173-1183.
Kate, N. (1998, January). Women want privacy. American
Demographics, 20(1), 37.
Katz, E., Blumler, J., & Gurevitch, M. (1974). Utilization
of mass communication by the individual. In J. G. Blumler & E. Katz (Eds.), The uses of mass communication:
Current perspectives on gratifications research (pp.
19-34). Beverly Hills, CA: Sage.
Katz, E., Gurevitch, M., & Hass, S. (1973). On the use of
mass media for important things. American Sociological
Review, 38,164-181.
Kaufman, J. H., Edlund, S., Ford, D. A., & Powers, C.
(2005). The social contract core. Electronic Commerce
Research, 5(1), 141-165.
Kaye, B. K. (1998). Uses and gratifications of the world
wide web: From couch potato to web potato. The New
Jersey Journal of Communication, 6, 21-40.
Keeney, M., Kowalski, E., Cappelli, D., Moore, A., Shimeall, T., & Rogers, S. (2005). Insider threat study: Computer system sabotage in critical infrastructure sectors. U.S. Secret Service and CERT Coordination Center/SEI. Retrieved November 2007, from http://www.CERT.org/archive/pdf/insidercross051105.pdf
Kehoe, C., Pitkow, J., Sutton, K., Aggarwal, G., &
Rogers, J. D. (1999). Results of GVU's tenth world wide
web user survey. Retrieved November 16, 2006, from
http://www.gvu.gatech.edu/user_surveys/survey-199810/tenthreport.html
Keith, W. B. (2005). Internet addiction: A review of current
assessment techniques and potential assessment questions. Cyberpsychology & Behavior, 8(1), 7-14.
Kellan, A. (2001). Whiz kid has fingerprints all over
New Napster. CNN Technology News, July 5. Retrieved
April 16, 2007, from http://edition.cnn.com/2001/TECH/
internet/07/05/napster.fingerprint/index.html
Keller & Murray, (1999). IT law in the European Union.
London: Sweet and Maxwell.

Keller, L. S. (1988, July). Machismo and the hacker mentality: Some personal observations and speculations. Paper presented to the WiC (Women in Computing) Conference.
Kelley, C. M., Denton, A., & Broadbent, R. (2001). Privacy concerns cost ecommerce $15 billion. Cambridge,
MA: Forrester Research.
Kelly, E. P., & Erickson, G. S. (2004). Legal and privacy
issues surrounding customer databases and e-merchant
bankruptcies: Reflections on toysmart.com. Industrial
Management & Data Systems, 104(3), 209-217.
Kershaw, B. (1994). Framing the audience for theater.
In R. Keat, N. Whiteley, & N. Abercrombie (Eds.), The
authority of the consumer. London: Routledge.
Kesmodel, D., & Wilkie, J. R. (2007). Whole foods is
hot, wild oats a dud - so said Rahodeb. The Wall Street
Journal, 250(9), A1.
Kilbourne, W., & Weeks, S. (1997). A socio-economic
perspective on gender bias in technology. Journal of
Socio-Economics, 26(1), 243-260.
Kilgore, H. E. (2004). Signed, sealed, protected: Solutions
to agency handling of confidential business information
in informal rulemaking. Administrative Law Review,
56(2), 519-534.
Kim, D. J., Steinfield, C., & Lai, Y. (2004). Revisiting
the role of web assurance seals in consumer trust. In M.
Janssen, H. G. Sol, & R. W. Wagenaar (Eds.), Proceedings of the 6th International Conference on Electronic
Commerce (ICEC 04, 60) (pp. 280-287). Delft, The
Netherlands: ACM Press.
Kim, S., Williams, R., & Lee, Y. (2003). Attitude toward
online shopping and retail website quality: A comparison
of US and Korean consumers. Journal of International
Consumer Marketing, 16(1), 89-111.
Kimery, K. M., & McCord, M. (2002). Third-party assurances: mapping the road to trust in e-retailing. Journal
of Information Technology Theory and Application,
4(2), 63-82.


Kimery, K., & McCord, M. (2006). Signals of trustworthiness in e-commerce: Consumer understanding
of third-party assurance seals. Journal of Electronic
Commerce in Organizations, 4(4), 52-74.

Krill, P. (2002). DoubleClick discontinues web tracking service. ComputerWorld. Retrieved February 24,
2002, from http://www.computerworld.com/storyba/
0,4125,NAV47_STO67262,00.html

King, J., Bond, T., & Blandford, S. (2002). An investigation of computer anxiety by gender and grade. Computers
in Human Behavior, 18, 69-84.

Krone, T. (2005). Concepts and terms. Canberra: The Australian High Tech Crime Centre.

King, N. J., Pillay, S., & Lasprogata, G. A. (Spring 2006). Workplace privacy and discrimination issues related to genetic data: A comparative law study of the European Union and the United States. American Business Law Journal, 43. Retrieved June 4, 2007, from https://ezproxy.royalroads.ca
Kirk, J. (2007, May 17). Estonia recovers from massive
denial-of-service attack. InfoWorld, IDG News Service.
Retrieved November 2007, from http://www.infoworld.
com/article/07/05/17/estonia-denial-of-service-attack_1.html
Klang, M. (2004). Spyware: The ethics of covert software.
Ethics and Information Technology, 6, 193-202.

Krone, T. (2006). Gaps in cyberspace can leave us vulnerable. Platypus Magazine, 90 (March 2006), 31-36.
Kryczka, K. (2004). Ready to join the EU information society? Implementation of e-commerce directive
2000/31/EC in the EU acceding countries: The example
of Poland. International Journal of Law & Information
Technology, 12, 55.
Kuner, C. (2007). European data protection law:
Corporate regulation and compliance. USA: Oxford
University Press.
Lade, D (2007, July 15). Getting medication in privacy
is part of internet appeal, but there are risks. Sun Sentinel. Retrieved September 17, 2007, from LexisNexis
current News file.

Kohlman, R. J. (2006). Existence of physician patient relationship. 46 Am. Jur., Proof of Facts 2d 373. 1, 2, 3, 8.

Lahey, K. (2005, August 30). Red tape on a roll...and it must stop. The Age, 8.

Kolence, K. W., & Kiviat, P. J. (1973). Software unit profiles & Kiviat figures. ACM SIGMETRICS Performance Evaluation Review, 1973(2), 2-12.

Large, A., Beheshti, J., & Rahman, T. (2002). Gender differences in collaborative web searching behavior: An elementary school study. Information Processing & Management, 38, 427-443.

Korgaonkar, P. K., & Wolin, L. D. (1999). A multivariate analysis of web usage. Journal of Advertising Research, 39(2), 53-68.
Kramer, D. (2007). American jurisprudence: Supremacy
of constitutions; supreme laws (2nd ed.). Minneapolis:
The West Publishing Company.
Krebs, B. (2001). Etour.com data sales violate policy. Newsbytes News Network. Retrieved July 13, 2007, from http://findarticles.com/p/articles/mi_m0NEW/is_2001_May_29/ai_75139832
Krebs, B. (2003). Hackers to face tougher sentences.
Washington Post. Retrieved Feburary 4, 2004, from
http://www.washingtonpost.com/ac2/wp-dyn/A352612003Oct2?language=printer


LaRose, R., Mastro, D. A., & Eastin, M. S. (2001). Understanding internet usage: A social cognitive approach
to uses and gratifications. Social Computer Review, 19,
395-413.
Lauer, T., & Deng, X. (2007). Building online trust
through privacy practices. International Journal of
Information Security, 6(5), 323-331.
Lawson, P., & Lawford, J. (2003). Identity theft: The
need for better consumer protection. Ottawa: The Public
Interest Advocacy Centre.
Lawson, S. (2006, Nov. 29). Google describes its wi-fi
pitch. Retrieved December 1, 2006, from http://www.pcworld.
com/article/id,123157-page,1/article.html


Lea, M., Spears, R., & de Groot, D. (2001). Knowing me, knowing you: Anonymity effects on social identity processes within groups. Personality and Social Psychology Bulletin, 27, 526-537.
Leahy, P. (1998). The electronic FOIA amendments of
1996: Reformatting the FOIA for on-line access. Administrative Law Review, 50(2), 339-344.
Lederer S., Mankoff, J., & Dey, A. (2003). Towards a
deconstruction of the privacy space. In Proceedings of the
Ubicomp communities: privacy as boundary negotiation
Workshop on Privacy at Ubicomp2003.
Leitzsey, M. (2006). Facebook can cause problems for
students. Online PacerTimes. Retrieved October 2, 2007,
from http://media.www.pacertimes.com/media/storage/
paper795/news/2006/01/31/News/Facebook.Can.Cause.
Problems.For.Students-1545539.shtml
Lekakis, G. (2005). Computer crime: The Australian facts
and figures. Retrieved April 7, 2007, from http://www.
crime-research.org/news/19.07.2005/1373/
Lelia, G. (2001). Treating internet users as audiences:
Suggesting some research directions. Australian Journal
of Communication, 28(1), 33-42.
Levary, R., Thompson, D., Kot, K., & Brothers, J. (2005).
Radio frequency identification: Legal aspects. Richmond
Journal of Law & Technology, 12(Fall), 6.
Levy, S. (1984). Hackers: Heroes of the computer revolution. New York: Dell.
Levy, S., & Gutwin, C. (2005). Security through the eyes of users. In Proceedings of the 14th International Conference on the World Wide Web (pp. 480-488). Chiba, Japan.
Lewandowski, J. (2002). Stepping off the sidewalk: An
examination of the data collection techniques of web
sites visited by children. Doctoral dissertation, Purdue
University, W. Lafayette, IN.
Lewis, C. A. (2004). Cybermedicine: Do you need to know
or see your physician, or is it safe to practice medicine
in cyberspace? Retrieved March 28, 2007, from http://
gsulaw.gsu.edu/lawand/papers/fa04/lewis/doc/htm

Lichtenstein, S., Swatman, P., & Babu, K. (2003). Adding value to online privacy for consumers: remedying
deficiencies in online privacy policies with a holistic approach. In Proceedings of the 36th Hawaii International
Conference on System Sciences.
LII. (2007). US Code Collection, Title 17, Copyrights,
Legal Information Institute, Cornell Law School.
LII. (2007). U.S. Code Collection, Title 18, S 2510, The
Electronic Communications Privacy Act of 1986, Legal
Information Institute, Cornell Law School.
LII. (2007). United States Constitution Article I Section
8, Legal Information Institute, Cornell Law School.
LII. (2007c). The Bill Of Rights: U.S. Constitution,
Amendment IV, Legal Information Institute, Cornell
Law School.
Lilien, L., & Bhargava, B. (2006). A scheme for privacy-preserving data dissemination. IEEE Transactions on
Systems, Man and Cybernetics, Part A: Systems and
Humans, 36(3), 503-506.
Lilien, L., Gupta, A., & Yang, Z. (2007). Opportunistic
networks for emergency applications and their standard
implementation framework. In Proceedings of the First
International Workshop on Next Generation Networks for
First Responders and Critical Infrastructure, NetCri07,
New Orleans, LA (pp. 588-593).
Lilien, L., Kamal, Z. H., Bhuse, V., & Gupta, A. (2006).
Opportunistic networks: The concept and research challenges in privacy and security. In P. Reiher, K. Makki,
& S. Makki (Eds.), Proceedings of the International
Workshop on Research Challenges in Security and
Privacy for Mobile and Wireless Networks (WSPWN06),
Miami, FL (pp. 134-147).
Lin, C. A. (1996). Looking back: The contribution of
Blumler and Katz's uses of mass communication to
communication research. Journal of Broadcasting &
Electronic Media, 40(4), 574-581.
Lisse, J. (2007). Behcet disease. Retrieved June 3, 2007, from http://www.emedicine.com/med/topic218.htm


Liu, C., Marchewka, J., Lu, J., & Yu, C. (2004). Beyond
concern: A privacy-trust-behavioral intention model
of electronic commerce. Information & Management,
42(1), 127-142.

Mansoorian, A. (2006). Measuring factors for increasing trust of people in e-transactions. Lulea, Sweden: Lulea University of Technology.

Liu, S., & Silverman, M. (2001). A practical guide to biometric security technology. IT Professional, IEEE, 3(1), 27-32.

Mansor, P. (2003, April-May). Consumer interests in global standards. Paper presented at the Global Standards Collaboration (GSC) 8 User Working Group Session, Ottawa.

Livingstone, S. (2004). The challenge of changing audiences: Or, what is the audience researcher to do in the
age of the Internet? European Journal of Communication, 19(1), 75-86.

Marchiori, M. (2002). The platform for privacy preferences 1.0 (P3P1.0) specification. W3C recommendation. W3C. Retrieved June 5, 2007, from http://www.
w3.org/TR/P3P/

Lohmann, F. (2002). Fair Use and Digital rights management, Computers, Freedom & Privacy, Electronic
Frontier Foundation.

Marsh, S. (1994). Formalizing trust as a computational concept. Ph.D. thesis, University of Stirling, U.K.

Longhurst, B., Bagnall, G., & Savage, M. (2004). Audiences, museums and the English middle class. Museum
and Society, 2(2), 104-124.
Lurie, P., & Zieve, A. (2006). Sometimes the silence
can be like thunder: Access to pharmaceutical data at
the FDA. Law and Contemporary Problems, 69(Summer), 85-97.
Lynch, E. (1997). Protecting consumers in the cybermarket. OECD Observer, 208(Oct/Nov), 11-15.
Mack, A. (2000, January 3). Op-Ed Re: Sometimes the
patient knows best. New York Times. Retrieved June 5,
2007, from LexisNexis News service.
MacRae, P. (2003). Avoiding eternal spamnation.
Chatswood, NSW: Australian Telecommunications Users
Group Limited (ATUG).
Majoras, D. P., Swindle, O., Leary, T. B., Harbour, P. J., &
Leibowitz, J. (2005). The US SAFE WEB Act: Protecting
consumers from spam, spyware, and fraud. A legislative
recommendation to congress. Washington D.C.: Federal
Trade Commission (US).
Malhotra, N. K., Kim, S. S., & Agarwal, J. (2004). Internet users' information privacy concerns (IUIPC): The
construct, the scale, and a causal model. Information
Systems Research, 15(4), 336-355.


Marshall, A. M., & Tompsett, B. C. (2005). Identity theft in an online world. Computer Law & Security Report, 21, 128-137.
Martin, A. (2007, June 14). Kellogg to phase out some
food ads to children. The New York Times. Retrieved
July 11, 2007, from www.nytimes.com/2007/06/14/
business/14kellogg.html?ex=1183953600&en=8f34f9b
e3c1620f7&ei=5070
Matyas, V., Jr., & Riha, Z. (2003). Toward reliable user
authentication through biometrics. Security & Privacy
Magazine, IEEE, 1(3), 45-49.
Maxeiner, J. R. (1995). Freedom of information and the
EU data protection directive. Federal Communications
Law Journal, 48(December), 93-104.
Mayer, R. N. (2002). Shopping from a list: International
studies of consumer online experiences. Journal of
Consumer Affairs, 36(1), 115-126.
McAfee, J., & Haynes, C. (1989). Computer viruses,
worms, data diddlers, killer programs, and other threats
to your system. New York: St. Martin's Press.
McElhearn, K. (2006). iSpy: More on the iTunes MiniStore and privacy. Kirkville. Retrieved July 7, 2007, from
http://www.mcelhearn.com/article.php?story=20060112
175208864
McGillicuddy, S. (2006, November 1). Encrypting mobile devices: A best practice no one uses. SearchSMB.com, http://searchSMB.techtarget.com/originalContent/0,289142,sid44_gci1227295,00.html?asrc=SS_CLA_300336&psrc=CLT_44
McGrath, R. (2005). Only a matter of time: Lessons
unlearned at the food and drug administration keep
Americans at risk. Food and Drug Law Journal, 60,
603-624.
McKenna, K. Y. A., & Bargh, J. A. (2000). Plan 9 from
cyberspace: The implications of the internet for personality and social psychology. Personality and Social
Psychology Review, 4, 57-75.
McKnight, D., Choudhury, V., & Kacmar, C. (2002).
Developing and validating trust measures for e-commerce: An integrative topology. Information Systems
Research, 13(3), 334-359.
McKnight, D. H., & Chervany, N. L. (2001). Conceptualizing trust: A typology and e-commerce customer
relationships model. In Proceedings of the 34th Annual
Hawaii International Conference on System Sciences,
HICSS-34, Island of Maui, Hawaii (Volume 7, 10 pages).
Washington, D.C.: IEEE Computer Society. Retrieved
June 5, 2007, from http://csdl2.computer.org/comp/proceedings/hicss/2001/0981/07/09817022.pdf
McKnight, D. H., & Chervany, N. L. (2001). What trust
means in e-commerce customer relationships: An interdisciplinary conceptual typology. International Journal
of Electronic Commerce, 6(2), 35-59.
McKnight, D. H., Choudhury, V., & Kacmar, C. (2000).
Trust in e-commerce vendors: A two-stage model. In
Proceedings of the Association for Information Systems
(pp. 532-536). Atlanta, GA.
McRobb, S., & Rogerson S. (2004). Are they really listening? An investigation into published online privacy
policies at the beginning of the third millennium. Information Technology & People, 17(4), 442-461.
McSweeney, B. (2002). Hofstede's model of national cultural differences and their consequences: A triumph of faith - a failure of analysis. Human Relations, 55(1), 89-118.

Media Week marketer's guide to media (Vol. 29). (2006). New York: VNU Business Publications.
MedicineNet. (2007). Orphan disease definition.
Retrieved June 4, 2007, from http://www.medterms.
com/script/main/art.asp?articlekey=11418
Medlaw.com. (2006). Retrieved from http://www.
medlaw.com/healthlaw/Medical_Records/8_4/womanpleads-guilty-to-se.shtml
Meinel, C. (2006). Appendix A: How do hackers break
into computers? In B. Schell &
Mencik, S. (1999). Are secure internet transactions really secure? Retrieved June 6, 2007, from http://www.
jsweb.net/paper.htm
Mercuri, R. T. (2004). The HIPAA-potamus in health
care data security. Communications of the ACM, 47(7),
25-28.
Metz, C. (2003, February 27). Is Google invading your
privacy? Retrieved December 2, 2006, from http://www.
pcmag.com/article2/0,4149,904096,00.asp
Metz, C. (2005, August 23). Identity theft is out of control.
PC Magazine, 87-88.
Metz, J. (2004). Practical insights to HIPAA: Overview
and general background regarding HIPAA. Retrieved
May 21, 2007, from http://www.dilworthlaw.com/pdf/
hipaa.pdf
Metzger, M. (2006). Effects of site, vendor, and consumer
characteristics on web site trust and disclosure. Communication Research, 33(3), 155-179.
Meyer, G. R. (1989). The social organization of the
computer underworld. Retrieved October 13, 2007, from
http://bak.spc.org/dms/archive/hackorg.html
Meyers-Levy, J., & Maheswaran, D. (1991). Exploring differences in males' and females' processing strategies.
Journal of Consumer Research, 18(June), 63-70.
Michael, P., & Pritchett, E. (2001). The impact of HIPAA electronic transmissions and health information privacy standards. Journal of the American Dietetic Association, 101(5), 524-528.




Microsoft. (2003). Windows media player 9 series privacy settings. Retrieved July 7, 2007, from http://www.
microsoft.com/windows/windowsmedia/player/9series/
privacy.aspx
Milberg, S. J., Smith, H. J., & Burke, S. J. (2000). Information privacy: Corporate management and national
regulation. Organization Science, 11(1), 35-57.
Miller, R. D. (2006). Health care law. Sudbury, Massachusetts: Jones & Bartlett Publishers.
Milloy, M., Fink, D., & Morris, R. (2002, June). Modelling online security and privacy to increase consumer
purchasing intent. Paper presented at the Informing
Science + IT Education Conference, Ireland.
Mills, E. (2005, July 14). CNET.com. Retrieved November 7, 2006, from news.com.com/Google+balances+privacy,+reach/2100-1032_3

Milne, G. R. (2000). Privacy and ethical issues in database/interactive marketing and public policy: A research framework and overview of the special issue. Journal of Public Policy & Marketing, 19(1), 1-6.
Milne, G. R. (2003). How well do consumers protect
themselves from identity theft? The Journal of Consumer
Affairs, 37(2), 388-402.
Milne, G. R., & Culnan, M. J. (2002). Using the content
of online privacy notices to inform public policy: A
longitudinal analysis of the 1998-2001 U.S. web surveys.
The Information Society, 18(5), 345-359.
Milne, G. R., Culnan, M. J., & Greene, H. (2006). A longitudinal assessment of online privacy notice readability.
Journal of Public Policy & Marketing, 25(2), 238-249.
Milne, G. R., & Rohm, A. J. (2000). Consumer privacy
and name removal across direct marketing channels:
Exploring opt-in and opt-out alternative. Journal of
Public Policy & Marketing, 19(2), 238-249.
Milne, G. R., Rohm, A. J., & Bahl, S. (2004). Consumers' protection of online privacy and identity. Journal of Consumer Affairs, 38(2), 217-232.

Milne, G. R. (1997). Consumer participation in mailing lists: A field experiment. Journal of Public Policy & Marketing, 16, 298-309.
Miyazaki, A. D., & Fernandez, A. (2000). Internet
privacy and security: An examination of online retailer
disclosures. Journal of Public Policy & Marketing,
19(1), 54-61.
Miyazaki, A. D., & Fernandez, A. (2001). Consumer perceptions of privacy and security risks for online shopping.
The Journal of Consumer Affairs, 35(1), 27-44.
Miyazaki, A., & Krishnamurthy, S. (2002). Internet
seals of approval: effects on online privacy policies and
consumer perceptions. Journal of Consumer Affairs,
36(1), 28-49.
Mizutani, M., Dorsey, J., & Moor, J. (2004). The internet
and Japanese conception of privacy. Ethics and Information Technology, 6(2), 121-128.
Moerel, L. (2001). The country of origin principle in
the e-commerce directive: the expected one stop shop.
Computer and Telecommunications Law Review, 7(7),
184-190.
Moghe, V. (2003). Privacy management: A new era in the
Australian business environment. Information Management & Computer Security, 11(2), 60-66.
Montana, J. C. (2001). Data mining: A slippery slope.
Information Management Journal, 35(4), 50-52.
Moor, J. H. (1997). Towards a theory of privacy in the
information age. Computers and Society, 27(3), 27-32.


Moores, T. (2005). Do consumers understand the role of privacy seals in e-commerce? Communications of the ACM, 48(3), 86-91.

Mukherjee, A., & Nath, P. (2007). Role of electronic trust in online retailing: A re-examination of the commitment-trust theory. European Journal of Marketing, 41(9/10), 1173-1202.


Mulligan, D. (2004). Privacy and information goods. FTC RFID Workshop, June 21. www.ftc.gov/bcp/workshops/rfid

Morinaga, S., Yamanishi, K., Tateishi, K., & Fukushima, T. (2002). Mining product reputations on the web.
In Proceedings of the 8th ACM SIGKDD International
Conference on Knowledge Discovery and Data Mining
(pp. 341-349). New York: ACM Press. Retrieved June
5, 2007, from citeseer.ist.psu.edu/morinaga02mining.
html

Mulligan, D., & Schwartz, A. (2000). P3P and privacy: An update for the privacy community. Center for Democracy & Technology. Retrieved April 16, 2007, from http://www.cdt.org/privacy/pet/p3pprivacy.shtml
http://www.cdt.org/privacy/pet/p3pprivacy.shtml

Morris, M., & Ogan, C. (1996). The internet as a mass medium. Journal of Communication, 46(1), 39-50.
Moser v. Stallings, 387 N.W.2d 599 (Iowa 1986). Retrieved
June 1, 2007 from LexisNexis law School database.
Motion, P. (2001). The Brussels regulation and e-commerce: A premature solution to a fictional problem.
Computer and Telecommunications Law Review, 7(8),
209-215.
Moulinos, K., Iliadis, J., & Tsoumas, V. (2004). Towards
secure sealing of privacy policies. Information Management & Computer Security, 12(4), 350-361.
Mowbray, A. (2007). Cases and materials on the European convention on human rights (4th ed.). Oxford
University Press.
Mui, L. (2002). Computational models of trust and reputation: Agents, evolutionary games, and social networks.
Ph.D. thesis, Massachusetts Institute of Technology.
Mui, L., Mohtashemi, M., & Halberstadt, A. (2002). A computational model of trust and reputation for e-businesses. In Proceedings of the 35th Annual Hawaii International Conference on System Sciences, HICSS02, Track 7, Island of Hawaii, Hawaii (9 pages). Washington, D.C.: IEEE Computer Society. Retrieved June 5, 2007, from http://csdl2.computer.org/comp/proceedings/hicss/2002/1435/07/14350188.pdf

Munoz, S. (2003, November 11). Nagging issue: Pitching junk to kids. The Wall Street Journal Online. Retrieved July 14, 2007, from http://online.wsj.com
July 14, 2007, from http://online.wsj.com
Muris, T. J. M. (2002, October). The interface of competition and consumer protection. Paper presented at
the Fordham Corporate Law Institute's Twenty-Ninth
Annual Conference on International Antitrust Law and
Policy, New York.
Muscat, A. (2007, January 17). Perils of portable storage. Computer Reseller News. Retrieved http://www.gfi.
com/documents/32686_crn_eprint.pdf
MySpace. (2007). Retrieved December 9, 2007, from
http://www.myspace.com/
Nachimias, R., Mioduser, D., & Shelma, A. (2001).
Information and communication technologies usage by
students in an Israeli high school: Equity, gender, and
inside/outside school learning issues. Education and
Information Technologies, 6(1), 43-53.
Nakada, M., & Tamura, T. (2005). Japanese conceptions
of privacy: An intercultural perspective. Ethics and
Information Technology, 7(1), 27-36.
Nam, C., Song, C., Lee, E., & Park, C. (2005). Consumers'
privacy concerns and willingness to provide marketing-related personal information online. Advances in
Consumer Research, 33, 212-217.




Nash, J. M. (2002). The geek syndrome. Retrieved July 5, 2007, from http://www.time.com/time/covers/1101020506/scaspergers.html
Nath, S. W. (2006). Relief for the e-patient? Legislative and judicial remedies to fill HIPAA's privacy gaps.
George Washington Law Review, 74, 532-540.
National Consumers League (USA). (n.d.). E-ssentials for
online privacy. Retrieved June 25, 2007, from http://www.
nclnet.org/technology/essentials/privacy.html
National Do Not Call Registry of 2003, H.R. 395, 108th
Congress, (2003).
National Office for the Information Economy (Australia).
(2003). Spam: Final report of the NOIE review of the
spam problem and how it can be countered. Canberra,
ACT: Department of Communication, Information
Technology and the Arts.
Norman, D. (1983). Design rules based on analysis
of human error. Communications of the ACM, 26(4),
254-258.
North American Consumer Project on Electronic Commerce (NACPEC). (2006). Internet consumer protection
policy issues. Geneva: The Internet Governance Forum
(IGF).
Noteberg, A., Christiaanse, E., & Wallage, P. (2003).
Consumer trust in electronic channels. E-Service Journal, 2(2), 46-67.
Nowak, G. J., & Phelps, J. (1992). Understanding privacy
concerns: An assessment of consumers' information-related knowledge and beliefs. Journal of Direct Marketing, 6(Autumn), 28-39.
NSW Office of Fair Trading. (2003). International consumer rights: The world view on international consumer
rights. Retrieved November 15, 2006, from http://www.
fairtrading.nsw.gov.au/shopping/shoppingtips/internationalconsumerrights.html
Nyshadham, E. A. (2000). Privacy policy of air travel
web sites: a survey and analysis. Journal of Air Transport
Management, 6(3), 143-152.



O'Brien, K., & Crampton, T. (2007). EU asks Google to explain data retention policies. International Herald Tribune. Retrieved July 10, 2007, from http://www.iht.com/articles/2007/05/25/business/google.php
O'Donnell v. Blue Cross Blue Shield of Wyo., 173 F. Supp. 2d 1176, 1179-80 (D. Wyo. 2001).
O'Grady, et al. v. Superior Court (2006), 139 Cal. App. 4th 1423.
O'Neil, D. (2001). Analysis of internet users' level of online privacy concerns. Social Science Computer Review, 19(1), 17-31.
O'Reilly, J. T. (1982). Regaining a confidence: Protection of business confidential data through reform of the freedom of information act. Administrative Law Review, 34, 263-313.
O'Sullivan, P. B. (2000). What you don't know won't hurt me: Impression management functions of communication channels in relationships. Human Communication Research, 26(3), 403-431.
Oakes, C. (1999). Mouse pointer records clicks. Wired
News. Retrieved June 28, 2006, from http://www.wired.
com/news/print/0,1294,32788,00.html
Oberndorf, S. (1998). Users remain wary. Multichannel
Merchant. Retrieved July 14, 2007, from http://multichannelmerchant.com/news/marketing_users_remain_wary/
OECD. (2000). OECD Guidelines for consumer protection in the context of electronic commerce. Paris:
OECD.
OECD. (2001). Australia: Annual report on consumer
policy development 2001. Retrieved March 11, 2005, from
http://www.oecd.org/dataoecd/33/45/1955404.pdf
OECD. (2001). OECD guidelines on the protection of
privacy and transborder flows of personal data. Paris:
OECD.
OECD. (2003). Report on compliance with, and enforcement of, privacy protection online. Paris: OECD.

OECD. (2006). Protecting consumers from cyberfraud. Paris: OECD.
Office of Health Information Technology (2004). Retrieved June 21, 2001 from http://www.Whitehouse.
gov/omb/egov/gtob/health_informatics.htm
Ohio Rev. Code Ann. 2743.43 (A)-(D) (LexisNexis 2007)
Expert testimony on liability issues. Retrieved June 1,
2007, from LexisNexis Ohio Cases.
Okamura, H. (2004). Privacy protection and rights to
use privacy information in secure environment. Japan:
Foundation Internet Association.
Olivero, N., & Lunt, P. (2004). Privacy versus willingness to disclose in e-commerce exchanges: The effect of
risk awareness on the relative role of trust and control.
Journal of Economic Psychology, 25(2), 243-262.
Ono, H., & Zavodny, M. (2003). Gender and the internet.
Social Science Quarterly, 84(1), 111-121.
Ono, H., & Zavodny, M. (2005). Gender differences in
information technology usage: A U.S.-Japan comparison.
Sociological Perspectives, 48(1), 105-133.
OPA. (2005). Online privacy alliance: guidelines for
online privacy policies. Retrieved March 25, 2005, from
http://www.privacyalliance.org/resources/ppguidelines.
shtml
Oren, J. S. T. (2003). International jurisdiction over
consumer contracts in e-Europe. International and
Comparative Law Quarterly, 52, 665.
Osterman Research Inc. (2003). The impact of regulations
on email archiving requirements. ORI white paper sponsored by Information Management Research. Retrieved
October 2007, from http://www.Ostermanresearch.
com/whitepapers/or_imr01.pdf
Ostrov, B. (2007, March 14). Menlo Park teen's suicide
shines light on shadowy market. The Mercury News.
Retrieved September 18, 2007, from LexisNexis News
library.
Ou, G. (2007) Wireless LAN security myths that will
not die. ZDNet. Retrieved July 2007, from http://blogs.
zdnet.com/Ou/?p=454

Padilla, R. (2007). Root out data breach dangers by first implementing common sense. TechRepublic. Retrieved July 2007, from http://blogs.techrepublic.com.com/techmanager/?p=312
Pai, A. K., & Basu, S. (2007). Offshore technology
outsourcing: overview of management and legal issues.
Business Process Management Journal, 13(1), 21-46.
Palmgreen, P., Wenner, L. A., & Rosengren, K. E. (1985).
Uses and gratifications research: The past ten years. In
K. E. Rosengren, L. A. Wenner, & P. Palmgreen (Eds.),
Media gratifications research: Current perspective (pp.
11-37). Beverly Hills, CA: Sage.
Pan, Y., & Zinkhan, G. (2006). Exploring the impact of
online privacy disclosures on consumer trust. Journal
of Retailing, 82(4), 331-338.
Papacharissi, Z. (2004, May). The blogger revolution?
Audiences as media producers. Paper presented at the
annual convention of the International Communication
Association, New Orleans, LA.
Papacharissi, Z. (2002). The self online: The utility of
personal home pages. Journal of Broadcasting & Electronic Media, 46(3), 346-368.
Papazafeiropoulou, A., & Pouloudi, A. (2001). Social
issues in electronic commerce: Implications for policy
makers. Information Resources Management Journal,
14(4), 24-32.
Parameswaran, M., & Whinston, A. (2007). Research
issues in social computing. Journal of the Association
for Information Systems, 8(6), 336-350.
Parker, B. J., & Plank, R. E. (2000). A use and gratifications perspective on the internet as a new information
source. American Business Review, 18(2), 43-49.
Patel, A., & Lindley, A. (2001). Resolving online disputes: Not worth the bother? Consumer Policy Review,
11(1 (Jan/Feb)), 2-5.
Patients' Rights Conditions of Participation (Revised 2007). 42 C.F.R. 482.13, 482.24 and 482.51.




Patton, M., & Josang, A. (2004). Technologies for trust in electronic commerce. Electronic Commerce Research, 4(1-2), 9-21.
PC Magazine. (2005). The perils of online shopping. PC
Magazine, 24(14), 23.
Perritt, Jr., H. H. (1995). Sources of rights to access public
information. William & Mary Bill of Rights Journal,
4(Summer), 179-221.
Perritt, Jr., H. H. (1998). Electronic freedom of information. Administrative Law Review, 50(2), 391-419.
Perritt, Jr., H. H., & Lhulier, C. J. (1997). Information
access rights based on international human rights law.
Buffalo Law Review, 45(Fall), 899-929.
Pettus v. Cole, 57 Cal. Rptr. 2d 46 (Cal. Ct. App.
1996).
Petty, R. D., & Hamilton, J. (2004). Seeking a single policy
for contractual fairness to consumers: A comparison of
U.S. and E.U efforts. The Journal of Consumer Affairs,
38(1), 146-166.
Pfitzmann, A., & Köhntopp, M. (2000). Anonymity, unobservability, and pseudonymity: A proposal for terminology. In H. Federrath (Ed.), Workshop on Design Issues in Anonymity and Unobservability. Springer Verlag.

Pfitzmann, A., & Köhntopp, M. (2000). Anonymity, unobservability, and pseudonymity: A proposal for terminology. In H. Federrath (Ed.), Designing privacy enhancing technologies, Proceedings of the Workshop on Design Issues in Anonymity and Observability, Berkeley, California. Lecture Notes in Computer Science (Vol. 2009, pp. 1-9). Heidelberg, Germany: Springer.
Phelps, J. E., D'Souza, G., & Nowak, G. J. (2001). Antecedents and consequences of consumer privacy concerns: An empirical investigation. Journal of Interactive
Marketing, 15(4), 2-17.
Phelps, J. E., Nowak, G. J., & Ferrell, E. (2000). Privacy
concerns and consumer willingness to provide personal
information. Journal of Public Policy & Marketing,
19(1), 27-41.



Pillar, C. (2001, November 7). Web mishap: Kids' psychological files posted. L.A. Times, Nov., A1.
Plichtova, J., & Brozmanova, E. (1997). Social representations of the individual and the community well-being:
Comparison of the empirical data from 1993 and 1995.
Sociologia, 29(4), 375-404.
Policy Aware Web. (2006). Retrieved July 10, 2007, from
http://www.policyawareweb.org/
Postmes, T., Spears, R., Sakhel, K., & de Groot, D. (2001).
Social influence in computer-mediated communication:
The effects of anonymity on group behavior. Personality
and Social Psychology Bulletin, 27, 1243-1254.
Prime, J. S. (1996). A double-barrelled assault: How
technology and judicial interpretations threaten public
access to law enforcement records. Federal Communications Law Journal, 48(March), 341-369.
Pritts, J. L. (2002). Altered states: State health privacy
laws and the impact of the federal health privacy rule. Yale
Journal of Health Policy, Law & Ethics, 2, 334-340.
Privacy & American Business. (2002). Privacy on and off the internet: What consumers want. Hackensack, NJ: Privacy & American Business.
Privacy Act. (2004). Washington, D.C.: U.S. Environmental Protection Agency. Retrieved June 5, 2007, from
http://www.epa.gov/privacy/
Privacy and Identity Management for Europe. (Prime).
(2007). Retrieved July 10, 2007, from https://www.
prime-project.eu/
Privacy Bird Tour. (2007). Retrieved June 5, 2007, from
http://www.privacybird.org/tour/1_3_beta/tour.html
Privacy Commissioner (Australia), O. o. (2000). National
privacy principles (extracted from the privacy amendment (private sector) act 2000). Retrieved July 19, 2006,
from http://www.privacy.gov.au/publications/npps01.
html


Privacy Commissioner (Australia), O. o. (2003). National privacy principle 7 - identifiers in the health sector. Sydney: Office of the Privacy Commissioner (Australia).
Privacy Commissioner (Australia), O. o. (2006a). Annual
report 2005-2006. Melbourne, Victoria: Office of the
Victorian Privacy Commissioner.
Privacy Commissioner (Australia), O. o. (2006b). Industry
standard for the making of telemarketing calls. Sydney,
NSW: Office of the Privacy Commissioner (Australia).
Privacy Commissioner (Australia), O. o. (n.d.). State &
territory privacy laws. Retrieved August 2, 2005, from
http://www.privacy.gov.au/privacy_rights/laws/index.
html#2
Privacy Commissioner of Canada. (2000). The Personal Information Protection and Electronic Documents
Act.
Privacy online: A report to congress. (1998). Washington,
D.C.: U.S. Federal Trade Commission.
Privacy Rights Clearinghouse. (2006). HIPAA basics:
Medical privacy in the electronic age. Retrieved May
23, 2007, from http://www.privacyrights.org/fs/fs8ahipaa.htm
Privacy Rights Clearinghouse. (2007). A chronology of
data breaches. Retrieved October 2007, from http://www.
privacyrights.org/ar/ChronDataBreaches.htm
Privacy Rights Clearinghouse. (2007). A chronology of
data breaches. Retrieved on May 14, 2007, from http://
www.privacyrights.org/ar/ChronDataBreaches.htm
Provos, N., McNamee, D., Mavrommatis, P., Wang,
K., & Modadugu, N. (2007). The ghost in the browser
analysis of web-based malware. Google, Inc. Retrieved
http://www.usenix.org/events/hotbots07/tech/full_papers/Provos/Provos.pdf
Public Welfare. 45 C.F.R. 160 et seq.
Pujol, J. M., Sangüesa, R., & Delgado, J. (2002). Extracting reputation in multi agent systems by means of social network topology. In Proceedings of the First International Joint Conference on Autonomous Agents and Multiagent Systems, AAMAS 02, Bologna, Italy (pp. 467-474). New York: ACM Press. Retrieved June 5, 2007, from citeseer.ist.psu.edu/pujol02extracting.html
Purpose of Privacy mark system. (2007, April 9). Retrieved from Privacy Mark: http://privacymark.jp/privacy_mark/about/outline_and_purpose.html
Qualys. (2006). The laws of vulnerabilities: Six axioms
for understanding risk. Retrieved October 2007, from
http://developertutorials-whitepapers.tradepub.com/
free/w_qa02/pf/w_qa02.pdf
Quirk, P., & Forder, J. (2003). Electronic commerce
and the law (2nd ed.). Queensland: John Wiley & Sons
Australia, Ltd.
Quo, S. (2004). Spam: Private and legislative responses
to unsolicited electronic mail in Australia and the United
States. E Law - Murdoch University, 11(1).
Qutami, Y., & Abu-Jaber, M. (1997). Students' self-efficacy in computer skills as a function of gender and
cognitive learning style at Sultan Qaboos University.
International Journal of Instructional Media, 24(1),
63-74.
Raab, C. D., & Bennett, C. J. (1998). The distribution of
privacy risks: Who needs protection? The Information
Society, 14, 263-274.
Radcliff, D. (2001). Giving users back their privacy.
ComputerWorld. Retrieved February 28, 2002, from
http://www.computerworld.com/storyba/0,4125,NAV47_
STO61981,00.html
Radin, T. J., Calkins, M., & Predmore, C. (2007). New
challenges to old problems: Building trust in e-marketing.
Business and Society Review, 112(1), 73-98.
Ranganathan, C., & Ganapathy, S. (2002). Key dimensions of business-to-consumer web sites. Information & Management, 39(6), 457-465.
Recording Industry Association of America Inc. v.
Verizon Internet Services, Inc., Volume Number 037015, (2003).
Regan, K. (2001). Does anyone read online privacy policies? Ecommerce Times. Retrieved October 2007, from http://www.ecommercetimes.com/perl/story/11303.html

thefreecountry.com (2007). Free anonymous surfing. Retrieved October 25, 2007, from http://www.thefreecountry.com/security/anonymous.shtml
Reinard, J. C. (2006). Communication research statistics.
CA: Sage.
Reips, U.-D. (2007). Internet users' perceptions of
privacy concerns and privacy actions. International
Journal of Human-Computer Studies 65(6), 526-536.
Reiter, M., & Rubin, A. (1999). Anonymous web transactions with crowds. Communications of the ACM,
42(2), 32-48.
Reiter, M., & Rubin, A. (1999). Crowds: Anonymity for
web transactions. Communications of the ACM, 42(2),
32-48.
Report to Congressional Requestors. (2005). Information security: Emerging cybersecurity issues threaten federal
information systems. Retrieved December 12, 2006, from
http://www.gao.gov/new.items/d05231.pdf
Rescorla, E. (2005). Is finding security holes a good idea?
IEEE Security and Privacy, 3(1), 14-19.
Reuters. (1999). Teen pleads guilty to government
hacks. Retrieved October 13, 2007, from http://scout.
wisc.edu/Projects/PastProjects/net-news/99-11/99-1123/0007.html
Rice, S. (2000). Public environmental records - a treasure chest of competitive information. Competitive
Intelligence Magazine, 3(3), 13-19.
Richards, N. M. (2006). Reviewing the digital person:
Privacy and technology in the information age by Daniel
J. Solove. Georgetown Law Journal, 94, 4.
Richardson, R. (2007). 2007 CSI Computer Crime and
Security Survey. Retrieved October 12, 2007, from
http://www.GoCSI.com
Riegelsberger, J., Sasse, M., & McCarthy, J. (2003). Shiny
happy people building trust?: Photos on e-commerce
websites and consumer trust. In Proceedings of the
SIGCHI Conference on Human Factors in Computing
Systems (pp. 121-128). Ft. Lauderdale, FL.


Rivest, R. L., Shamir, A., & Adleman, L. A. (1978). A method for obtaining digital signatures and public-key cryptosystems. Communications of the ACM, 21(2), 120-126.
Roach, R. (2001). Internet usage reflects gender breakdown. Black Issues in Higher Education, 18(11), 119.
Roebuck, W. (2002). Jurisdiction and e-commerce.
Computer and Telecommunications Law Review, 8(2),
29-32.
Rogers, E. M. (1995). Diffusion of innovations (4th ed.).
New York: Free Press.
Rohm, A. J., & Milne, G. R. (1998). Emerging marketing
and policy issues in electronic commerce: Attitudes and
beliefs of internet users. In A. Andreasen, A. Simonson,
& N. C. Smith (Eds.), Proceedings of Marketing and
Public Policy (Vol. 8, pp. 73-79). Chicago: American
Marketing Association.
Roman, S. (2007). The ethics of online retailing: A scale
development and validation from the consumer's perspective. Journal of Business Ethics, 72(2), 131-148.
Rosenblatt, B., Trippe, B., & Mooney, S. (2002). Digital
rights management: Business and technology. New York:
M&T Books.
Rosenzweig, P., Kochems, A., & Schwartz, A. (2004).
Biometric technologies: Security, legal, and policy implications. NASSP Legal Memorandum, 12, 1-10.
Ross, R. (2007). Robotic insect takes off. Researchers
have created a robotic fly for covert surveillance. Technology Review, July 19. Retrieved July 20, 2007, from
http://www.technologyreview.com/Infotech/19068/
Rothberg, H. N., & Erickson, G. S. (2005). From knowledge to intelligence: Creating competitive advantage in
the next economy. Woburn, MA: Elsevier Butterworth-Heinemann.
Round, D. K., & Tustin, J. (2004, September). Consumers
as international traders: Some potential information issues for consumer protection regulators. Paper presented
at the International Trade Law Conference, Attorney-General's Department, Canberra, ACT.


Roussous, G., & Moussouri, T. (2004). Consumer perceptions of privacy, security and trust in ubiquitous commerce. Pers Ubiquit Comput, 8, 416-429.

Sandoval, G. (2006). Veterans' data swiped in theft. CNET News.com. Story last modified Mon May 22 16:55:51 PDT 2006.

Roy Morgan Research. (2004). Community attitudes towards privacy 2004. Sydney, NSW: The Office of the Federal Privacy Commissioner (Australia).

Sasahara, E. (2005, Sept. 29). Privacy data lost in financial business. Retrieved from IT pro: http://itpro.nikkeibp.co.jp/article/SMBIT/20050928/221820/?ST=management

Royal, C. (2005). A meta-analysis of journal articles intersecting issues of internet and gender. Journal of Technical
Writing and Communication, 35(4), 403-429.
Ruddock, P. (2006). Australian law reform commission to
review privacy act. Retrieved June 7, 2007, from http://
www.ag.gov.au/agd/WWW/MinisterRuddockHome.
nsf/Page/Media_Releases_2006_First_Quarter_31_
January_2006_-_Australian_Law_Reform_Commission_to_review_Privacy_Act_-_0062006#
Rule, C. (2002). Online dispute resolution for business.
San Francisco: Jossey-Bass.
Russ, A. (2001). Digital rights management overview.
SANS Information Security Reading Room, July 26.
Retrieved April 16, 2007, from http://www.sans.org/reading_room/whitepapers/basics/434.php
Saarenpää, T., & Tiainen, T. (2003). Consumers and e-commerce in information system studies. In M. Hannula, A.-M. Järvelin, & M. Seppä (Eds.), Frontiers of e-business
research: 2003 (pp. 62-76). Tampere: Tampere University
of Technology and University of Tampere.
Sabater, J., & Sierra, C. (2002). Social ReGreT, a reputation model based on social relations. ACM SIGecom
Exchanges, 3(1), 44-56.
Sagan, S. (2000). Hacker women are few but strong. Retrieved August 5, 2004, from http://abcnews.go.com/sections/tech/DailyNews/hackerwomen000602.html#top
Salehnia, A. (2002). Ethical issues of information systems.
Hershey, PA: Idea Group Incorporated.
Salganik, M. W. (2000, August 10). Health data on 858
patients mistakenly emailed to others; medical information was among messages sent out by Kaiser Health Care.
Baltimore Sun, Aug., 1C.

Sasahara, E. (2005, June 29). Security hole of privacy information database. Retrieved from IT pro: http://itpro.nikkeibp.co.jp/free/smbit/smbit/20050629/163612/?ST=smb
Savage, M. (2007, October 23). Proposed legislation
would strengthen cybercrime laws. Retrieved November 2007, from http://searchsecurity.techtarget.
com/originalContent/0,289142,sid14_gci1278341,00.
html?track=sy160
Savicki, V., Kelley, M., & Lingenfelter, D. (1996). Gender
and small task group activity using computer-mediated
communication. Computers in Human Behavior, 12,
209-224.
Savvas, A. (2005, March 22). Monitoring made harder
by cookie security fears, Computer Weekly. Retrieved
July 14, 2007, from
Sawyer, S., & Tapia, A. (2005). The sociotechnical nature
of mobile computing work: Evidence from a study of
policing in the United States. International Journal of
Technology and Human Interaction, 1(3), 1-14.
Schell, B. H., & Lanteigne, N. (2000). Stalking, harassment, and murder in the workplace: Guidelines for protection and prevention. Westport, CT: Quorum Books.
Schell, B. H., & Martin, C. (2006). Webster's new world
hacker dictionary. Indianapolis, IN: Wiley Publishing,
Inc.
Schell, B. H., Dodge, J. L., & Moutsatos, S. S. (2002).
The hacking of America: Who's doing it, why, and how.
Westport, CT: Quorum Books.
Schell, B.H. (2007). Contemporary world issues: The
internet and society. Santa Barbara, CA: ABC-CLIO.


Schellekens, M. H. M. (2004). Privacy and electronic signatures: Are they compatible? Computer and Telecommunications Law Review, 10(7), 182-186.
Schmidt v. Dept of Veterans Affairs, 218 F.R.D. 619
(E.D. Wis. 2003).
Schmidt, C. W. (2000). Patient health information goes
electronic. Retrieved July 2, 2007, from http://pubs.acs.
org/subscribe/journals/mdd/v03/i09/html/rules.html
Schneier, B. (1999). Biometrics: Uses and abuses. Communications of the ACM, 42(8), 136.
Schoder, D., & Yin, P. (2000). Building firm trust online.
Communications of the ACM, 43(12), 73-79.
Schuman, E. (2006). Gartner: $2 billion in e-commerce
sales lost because of security fears. eWeek.com, November
27. Retrieved October 15, 2007, from http://www.eweek.
com/article2/0,1895,2063979,00.asp
Schuman, E. (2007, November 14). TJMaxx's projected
breach costs increase to $216M. eWEEK. Retrieved
November 2007, from http://fe42.news.sp1.yahoo.com/
s/zd/20071114/tc_zd/219495

Seigneur, J.-M., & Jensen, C. D. (2004). Trading privacy for trust. In T. Dimitrakos (Ed.), Proceedings of the Second International Conference on Trust Management, iTrust 2004, Oxford, United Kingdom. Lecture Notes in Computer Science (Vol. 2995, pp. 93-107). Heidelberg, Germany: Springer.
Sensor Nation. (2004). [Special Report]. IEEE Spectrum,
41(7).
Serjantov, A., & Danezis, G. (2003). Towards an information theoretic metric for anonymity. In R. Dingledine &
P. F. Syverson. (Eds.), Proceedings of the 2nd International Workshop on Privacy Enhancing Technologies,
PET 2002, San Francisco, California. Lecture Notes in
Computer Science (Vol. 2482, pp. 259-263). Heidelberg,
Germany: Springer.
Sexton, R., Johnson, R., & Hignite, M. (2002). Predicting internet/e-commerce use. Internet Research, 12(5),
402-410.
Shah, R., & Piper, M. H. (2007). Wilson disease. Retrieved
June 4, 2007, from eMedicine: http://www.emedicine.com/med/topic2413.htm

Schwaig, K. S., Kane, G. C., & Storey, V. C. (2005). Privacy, fair information practices, and the Fortune 500: The virtual reality of compliance. The DATA BASE for Advances in Information Systems, 36(1).

Shalhoub, Z. (2006). Trust, privacy, and security in electronic business: The case of the GCC countries. Information Management & Computer Security, 14(3), 270-283.

Scott, C. (2004). Regulatory innovation and the online consumer. Law & Policy, 26(3-4), 477-506.

Sharrott, D. (1992). Provider-specific quality-of-care data: A proposal for limited mandatory disclosure. Brooklyn Law Review, 58(Spring), 85-153.

Scott, R. (2001). Health care symposium: E-health: The medical frontier: Cybermedicine and virtual pharmacies. West Virginia Law Review, 103, 412-413.
Scottish Consumer Council. (2001). E-commerce and
consumer protection: Consumers' real needs in a virtual
world. Glasgow: Scottish Consumer Council.
Seddigh, N., Pieda, P., Matrawy, A., Nandy, B., Lombadaris, J., & Hatfield, A. (2004). Current trends and
advances in information assurance metrics. Retrieved
October 10, 2007, from http://dev.hil.unb.ca/Texts/PST/
pdf/seddigh.pdf


Shaw, E., Post, J., & Ruby, K. (1999). Inside the mind of
the insider. Security Management, 43(12), 34-44.
Shaw, K., & Rushing, R. (2007). Podcast, Keith Shaw
(NetWorkWorld) talks with Richard Rushing chief security
officer at ... data, listen to this podcast. Retrieved October 2007, from http://www.networkingsmallbusiness.
com/podcasts/panorama/2007/022807pan-airdefense.
html?zb&rc=wireless_sec
Sheehan, K. B., & Hoy, M. G. (1999). Flaming, complaining, abstaining: How online users respond to privacy
concerns. Journal of Advertising, 28(3), 37-51.


Sheehan, K. B. (2002). Toward a typology of internet users and online privacy concerns. The Information Society, 18(1), 21-32.

Smith, H. J., Milberg, S. J., & Burke, S. J. (1996). Information privacy: Measuring individuals' concerns about organizational practices. MIS Quarterly, 20(2), 167-196.

Sheehan, K. B., & Hoy, M. G. (2000). Dimensions of
privacy concern among online consumers. Journal of
Public Policy & Marketing, 19(1), 62-73.
Shimeall, T. (2001, August 23). Internet fraud, Testimony
of Timothy J. Shimeall, Ph.D. CERT, Analysis Center
Software Engineering Institute, Carnegie Mellon University Pittsburgh, PA; Before the Pennsylvania House
Committee on Commerce and Economic Development,
Subcommittee on Economic Development, retrieved
October 2007, available at http://www.CERT.org/congressional_testimony/Shimeall_testimony_Aug23.html
Shimp, T. (2007). Advertising, promotion, and other
aspects of integrated marketing communications (7th
ed.). Mason, OH: Thomson South-Western.
Shinder, D. (2007, February 9). How SMBs can enforce
user access policies. Retrieved April 2007, from http://
articles.techrepublic.com.com/5100-1009_11-6157054.
html?tag=nl.e101
Singh, B. (2002). Consumer education on consumer rights
and responsibilities, code of conduct for ethical business, importance of product labelling. Kuala Lumpur:
Consumers International.
Sitton, J. V. (2006). When the right to know and the right
to privacy collide. Information Management Journal,
40(5), 76-80.
Sixsmith, J., & Murray, C. D. (2001). Ethical issues in the
documentary data analysis of internet posts and archives.
Qualitative Health Research, 11(3), 423-432.
Smith, L. (2004). Global online shopping: How well
protected is the Australian consumer? Australian Competition and Consumer Law Journal, 12(2), 163-190.
Smith, R. (2002). Microsoft response to the windows
media player 8 privacy advisory. Retrieved July 7, 2007,
from http://www.computerbytesman.com/privacy/wmp8response.htm
Smith, R. (2002). Serious privacy problems in windows
media player for Windows XP. Retrieved July 7, 2007,
from http://www.computerbytesman.com/privacy/wmp8dvd.htm
Smith, R. E. (2006). Laptop hall of shame. Forbes.com.
Commentary on Forbes.com September 7, 2006.
So, M., & Sculli, D. (2002). The role of trust, quality,
value and risk in conducting e-business. Industrial Management & Data Systems, 102(8/9), 503-512.
Solove, D. J. (2005). The coexistence of privacy and
security: Fourth amendment codification and Professor Kerr's misguided call for judicial deference. Fordham
Law Review, 74(November), 747.
Song, I., LaRose, R., Eastin, M. S., & Lin, C. A. (2004).
Internet gratifications and internet addiction: On the uses
and abuses of new media. CyberPsychology & Behavior,
7(4), 384-394.




Spector, L. (2003, December 26). Guide to online photo album sites: Here's how to post and share digital memories of your holidays. PC World. Retrieved April 14, 2004, from http://www.pcworld.com/news/article/0,aid,114040,00.asp
Stajano, F., & Anderson, R. (1999). The cocaine auction
protocol: On the power of anonymous broadcast. In A.
Pfitzmann (Ed.), Information Hiding Workshop 1999
(pp. 434-447). Springer Verlag.
Stanton, P. (2004). Securing data in storage: A review
of current research. CoRR, cs.OS/0409034.
Stapleton-Paff, K. (2007). Facebook poses privacy issues
for students. The Daily of the University of Washington.
Retrieved October 2, 2007, from http://www.thedaily.
washington.edu/article/2007/4/25/facebookPosesPrivacyIssuesForStudents
State of California. (2007). Privacy laws. Retrieved
September 26, 2007, from http://www.privacy.ca.gov/
lawenforcement/laws.htm
Stokes, M. (2007). Police charge three female students
for sign stealing. Western Herald. Retrieved October 22,
2007 from http://media.www.westernherald.com/media/
storage/paper881/news/2007/10/22/News/Police.Charge.
Three.Female.Students.For.Sign.Stealing-3046592.
shtml
Stone, E. F., Gardner, D. G., Gueutal, H. G., & McClure,
S. (1983). A field experiment comparing information-privacy values, beliefs, and attitudes across several
types of organizations. Journal of Applied Psychology,
68(3), 459-468.
Stoney, M. A. S., & Stoney, S. (2003). The problems of jurisdiction to e-commerce - some suggested strategies.
Logistics Information Management, 16(1), 74-80.
Strauss, J., El-Ansary, A., & Frost, R. (2006). E-marketing (4th ed.). Upper Saddle River, NJ: Pearson Prentice
Hall.
Suh, B., & Han, I. (2003). The impact of customer trust
and perception of security control on the acceptance of
electronic commerce. International Journal of Electronic
Commerce, 7(3), 135-161.



Sullivan, B. (2007). Spam is back, and worse than ever. Retrieved March 20, 2007, from http://redtape.msnbc.com/2007/01/spam_is_back_an.html
Sultan, F., & Rohm, A. (2005). The coming era of brand
in the hand marketing. MIT Sloan Management Review,
47(1), 82-90.
Summary of the HIPAA Privacy Rule. (2003). Washington, D.C.: The U.S. Department of Health and Human
Services.
Summary of the HIPAA Privacy Rules by HIPAA Compliance Assistance. (2005). Health & Human Services, May.
Susman, T. M. (1988). The privacy act and the freedom of
information act: Conflict and resolution. John Marshall
Law Review, 21, 703-733.
Suzuki, T. (2004, May 31). Overview of Yahoo! BB
data leak. Retrieved from IT pro, http://itpro.nikkeibp.
co.jp/free/NC/NEWS/20040531/145132/
Swanson, D. L. (1992). Understanding audiences: Continuing contributions of gratifications research. Poetics,
21, 305-328.
Swartz, N. (2006). HIPAA compliance declines, survey
says. Information Management Journal, 40(4), 16.
Sweeney, L. (2002). Achieving k-anonymity privacy
protection using generalization and suppression. International Journal on Uncertainty, Fuzziness and
Knowledge-based Systems, 10(5), 571-588.
Sweeney, L. (2001). Computational disclosure control:
A primer on data privacy protection. Ph.D. thesis, Massachusetts Institute of Technology.
Sweeney, L. (1998). Datafly: A system for providing
anonymity in medical data. In T. Y. Lin & S. Qian (Eds.),
Database security XI: Status and prospects. IFIP TC11
WG11.3 Eleventh International Conference on Database
Security, Lake Tahoe, California (pp. 356-381). Amsterdam, The Netherlands: Elsevier Science.
Sweeney, L. (2001). Information explosion. In L. Zayatz, P. Doyle, J. Theeuwes, & J. Lane (Eds.), Confidentiality, disclosure, and data access: Theory and practical applications for statistical agencies (26 pages). Washington, D.C.: Urban Institute. Retrieved June 5, 2007, from http://privacy.cs.cmu.edu/people/sweeney/explosion2.pdf

Taylor, P. A. (2003). Maestros or misogynists? Gender and the social construction of hacking. In Y. Jewkes (Ed.), Dot.cons: Crime, deviance and identity on the Internet (pp. 125-145). Portland, OR: Willan Publishing.

Sweeney, L. (2002). K-anonymity: A model for protecting privacy. International Journal on Uncertainty, Fuzziness and Knowledge-based Systems, 10(5), 557-570.

Taylor, R. W., Caeti, T. J., Loper, D. K., Fritsch, E. J., & Liederback, J. (2006). Digital crime and digital terrorism. Upper Saddle River, NJ: Pearson Prentice Hall.

Telemedical Report to Congress, 104th Cong., 2d Session (1996).

Sweeney, L. (1996). Replacing personally-identifying information in medical records, the scrub system. In J. J. Cimino (Ed.), Proceedings of the American Medical Informatics Association (pp. 333-337). Washington, D.C.: Hanley & Belfus. Retrieved June 5, 2007, from http://privacy.cs.cmu.edu/people/sweeney/scrubAMIA1.pdf
Symantec. (2006, September 19). Symantec finds firms
recognize importance of application security, yet lack
commitment in development process. News release.
http://www.symantec.com/about/news/release/article.
jsp?prid=20060919_01
Symptoms of Cushing Syndrome. (2007). WrongDiagnosis.com. Retrieved June 3, 2007, from http://www.
wrongdiagnosis.com/c/cushings_syndrome/symptoms.
htm
Syukunan, T. (2007). Analysis for efficiency of anti-spam
ware. Media Communications (Japanese), 57, 2-10.
Tahara, M., & Yokohari, K. (2005). Warning of usage of
privacy information. Tokyo: Asa.

Teo, T. S. H. (2002). Attitudes toward online shopping and the internet. Behaviour & Information Technology, 21(4), 259-271.
Teo, T. S. H., Lim, V. K. G., & Lai, R. Y. C. (1999).
Intrinsic and extrinsic motivation in internet usage.
OMEGA: International Journal of Management Science, 27, 25-37.
Teo, T., & Lim, V. (1997). Usage patterns and perceptions
of the internet: The gender gap. Equal Opportunities
International, 16(6/7), 1-8.
Tesler, W. (2000). Gould debunked: The prohibition
against using New York's freedom of information law
as a criminal discovery tool. New York Law School Law
Review, 44, 71-129.
The American Heritage Dictionary of the English Language (4th ed.). (2000). Boston: Houghton Mifflin.

Tavani, H. (1999). Privacy online. Computers and Society, 29(4), 11-19.

The CAN-SPAM Act: Requirements for Commercial Emailers. (2004). Retrieved from http://www.ftc.gov/bcp/conline/pubs/buspubs/canspam.htm

Tavani, H. T., & Grodzinsky, F. S. (2002). Cyberstalking, personal privacy, and moral responsibility. Ethics and Information Technology, 4(2), 123-133.

The Safe Internet Pharmacy Act of 2007, S.596, 110th Congress, 1st Session (Fall 2007). Retrieved June 5, 2007, from LexisNexis Congressional database.

Tavani, H., & Moor, J. (2001). Privacy protection, control of information, and privacy-enhancing technologies. Computers and Society, 31(1), 6-11.

Thomas, D. (2002). Hacker culture. Minneapolis, MN: University of Minnesota Press.

Taylor, C. R., Franke, G. R., & Maynard, M. L. (2000). Attitudes toward direct marketing and its regulation: A comparison of the United States and Japan. Journal of Public Policy & Marketing, 19(2), 228-237.

Thompson, B. (2003, February 21). Is Google too powerful? Retrieved December 12, 2006, from http://news.
bbc.co.uk/2/hi/technology/2786761.stm
Timothy, R. (1999). The construction of the world wide
web audience. Media, Culture & Society, 21(5), 673-684.




Top Ten Ways to Protect Online Privacy. (2003). Retrieved from http://www.cdt.org/privacy/guide/basic/topten.html

Traffic details for Google.com. (n.d.). Retrieved November 11, 2006, from http://www.alexa.com/data/details/traffic_details?q=www.google.com&url=google.com

Tyler, B. J. (1998). Cyberdoctors: The virtual housecall - the actual practice of medicine on the internet is here; is it a telemedical accident waiting to happen? Indiana Law Review, 13, 259-290.

Trammell, K. D., Williams, A. P., Postelincu, M., & Landreville, K. D. (2006). Evolution of online campaigning:
Increasing interactivity in candidate web sites and blogs
through text and technical features. Mass Communication & Society, 9(1), 21-44.
Treasury (Australia). (2006). The Australian guidelines
for electronic commerce (March 2006). Canberra, ACT:
Treasury (Australia)
TRUSTe. (2007). Retrieved December 9, 2007, from
http://www.truste.org/
Trustworthy Computing White Paper. (2003). Redmond,
Washington: Microsoft. Retrieved June 5, 2007, from
http://www.microsoft.com/mscorp/twc/twc_whitepaper.
mspx
Tschudin, C. (1999). Apoptosis - the programmed death of distributed services. In J. Vitek & C. D. Jensen
(Eds.), Secure internet programming. Security issues
for mobile and distributed objects. Lecture Notes in
Computer Science (Vol. 1603, pp. 253-260). Heidelberg,
Germany: Springer.
Turkle, S. (1984). The second self: Computers and the
human spirit. New York: Simon and Schuster.
Turow, J. (2003). Americans & online privacy: The system
is broken. Report from the Annenberg Public Policy
Center of the University of Pennsylvania, June.
Turow, J., & Hennessy, M. (2007). Internet privacy and
institutional trust: Insights from a national survey. New
Media & Society, 9(2), 300-318.
Tygar, J. D., & Yee, B. (1994). Dyad: A system for using physically secure coprocessors. In Proceedings of the Joint Harvard-MIT Workshop Technological Strategies for Protecting Intellectual Property in the Networked Multimedia Environment. Annapolis, MD: Interactive Multimedia Association. Retrieved July 20, 2007, from http://www.cni.org/docs/ima.ip-workshop/Tygar.Yee.html



U.S. Census Bureau. (2007). The census bureau of the department of commerce report on retail e-commerce sales.
U.S. Senate Permanent Subcommittee on Investigations.
(1986). Security in cyberspace. Retrieved April 3, 2007,
from http://www.fas.org/irp/congress/1996_hr/s960605t.
htm
UCITA. (2002). Uniform Computer Information Transactions Act. National Conference of Commissioners on
Uniform State Laws.
Udo, G. J. (2001). Privacy and security concerns as major
barriers for e-commerce: A survey study. Information
Management & Computer Security, 9(4), 165-174.
Uhl, K. E. (2003). The freedom of information act post-9/11: Balancing the public's right to know, critical infrastructure protection, and homeland security. American
University Law Review, 53(October), 261-311.
Ullman, E. (1997). Close to the machine: Technophilia
and its discontents. San Francisco: City Lights Books.
United Nations. (1948). Universal declaration of human
rights. New York: United Nations.
University of Miami Miller School of Medicine Privacy/
Data Protection Project. (2005). Privacy Standard/Rule
(HIPAA). Retrieved May 22, 2007, from http://privacy.
med.miami.edu/glossary/xd_privacy_stds_applicability.htm
Unze, D. (2007, August 3). Internet operation puts man in
prison. St. Cloud Times. Retrieved September 17, 2007,
from LexisNexis Current news file.
US Safe Web Act of 2006 enacted by 109th Congress March 16, 2006. (2006). Retrieved from http://www.govtrack.us/congress/bill.xpd?tab=summary&bill=s109-1608
USA Patriot Act of 2001 enacted October 23, 2001. (2001).
Retrieved from http://www.govtrack.us/congress/bill.
xpd?bill=h107-3162
USG. (1998). The Digital Millennium Copyright Act
of 1998, U.S. Copyright Office, Pub. L. 105-304, 112 Stat.
2860.
Vaile, D. (2004). Spam canned - new laws for Australia.
Internet Law Bulletin, 6(9), 113-115.
Van Dyke, T., Midha, V., & Nemati, H. (2007). The
effect of consumer privacy empowerment on trust and
privacy concerns in e-commerce. Electronic Markets,
17(1), 68-81.
Vasek, S. (2006). When the right to know and right
to privacy collide. Information Management Journal,
40(5), 76-81.
Vasiu, L., Warren, M., & Mackay, D. (2002, December).
Personal information privacy issues in B2C e-commerce:
a theoretical framework. Paper presented at the 7th Annual CollECTeR Conference on Electronic Commerce
(CollECTeR02), Melbourne, Victoria
Vaughn, R. G. (1984). Administrative alternatives and
the federal freedom of information act. Ohio State Law
Journal, 45(Winter), 185-214.
Vivisimo. (2006). Restricted access: Is your enterprise search solution revealing too much? Retrieved
October 2007, from http://Vivisimo.com/ or
http://www.webbuyersguide.com/bguide/whitepaper/
wpDetails.asp_Q_wpId_E_NzYyMQ
Vogelstein, F. (2007, April 9). Text of Wired's interview
with Google CEO Eric Schmidt. Retrieved July 15,
2007, from http://www.wired.com/techbiz/people/
news/2007/04/mag_schmidt_trans?curren
W3C (2002b). The platform for privacy preferences
1.0 (P3P1.0) specification. W3C P3P Working Group.
Retrieved April 16, 2007, from http://www.w3.org/TR/
P3P/
W3C. (2002). The platform for privacy preferences 1.0
(P3P1.0) specification. Retrieved October 25, 2007, from
http://www.w3.org/TR/P3P/
W3C. (2002a). How to create and publish your company's P3P policy in 6 easy steps. W3C P3P Working Group. Retrieved April 16, 2007, from http://www.w3.org/P3P/details.html
Wachter, G. W. (2001). Law and policy in telemedicine:
HIPAA's privacy rule summarized - what does it mean
for telemedicine? Retrieved May 30, 2007, from http://tie.
telemed.org/articles/article.asp?path=legal&article=hip
aaSummary_gw_tie01.xml
Wagealla, W., Carbone, M., English, C., Terzis, S., Lowe,
H. & Nixon, P. (2003, September). A formal model for
trust lifecycle management. In Proceedings of the 1st
International Workshop on Formal Aspects of Security
and Trust, FAST 2003, Pisa, Italy, (pp. 181-192). Retrieved
July 20, 2007, from http://www.iit.cnr.it/FAST2003/fastproc-final.pdf
Wahab, M. (2004). Globalisation and ODR: dynamics
of change in e-commerce dispute settlement. International Journal of Law and Information Technology,
12(1), 123-152.
Wajcman, J. (1991). Feminism confronts technology. University Park, PA: Pennsylvania State University Press.
Walczuch, R., Seelen, J., & Lundgren, H. (2001, September). Psychological determinants for consumer trust in
e-retailing. Paper presented at the Eighth Research Symposium on Emerging Electronic Markets (RSEEM'01),
Maastricht, The Netherlands.
Walden, I. (2001). Regulating e-commerce: Europe in
the global e-conomy. European Law Review, 26(6),
529-547.
Waldman, M., Rubin, A., & Cranor, L. F. (2000). Publius:
a robust, tamper-evident, censorship-resistant and source-anonymous web publishing system. In Proceedings of the 9th Usenix Security Symposium (pp. 59-72).
Wall, D. S. (2001). Cybercrimes and the Internet. In
D.S. Wall (Ed.), Crime and the Internet (pp. 1-17). New
York: Routledge.
Wallis, D. (1999). Report on the proposal for a council regulation on jurisdiction and the recognition and enforcement of judgements in civil and commercial matters. Committee on Legal Affairs in the Internal Market. COM (1999) 348 - C5-0169/1999 - 1999/0154 (CNS).
Walton, D. (2000). Hackers tough to prosecute, FBI says.
The Globe and Mail, February 10, B5.
Walton, R. (2006). Balancing the insider and outsider
threat. Computer Fraud and Security, 2006(11), 8-11.
Wang, H., Lee, M. K. O., & Wang, C. (1998). Consumer privacy concerns about internet marketing. Communications of the ACM, 41(3), 63-70.
Wang, P., & Petrison, L. A. (1993). Direct marketing
activities and personal privacy. Journal of Direct Marketing, 7(Winter), 7-19.
Warren, S., & Brandeis, L. (1890). The right to privacy.
Harvard Law Review, 4(5), 193-220.
Webex. (2006). On-demand vs. on-premise instant messaging. Webex Communications, Ease of Communications - On Demand EIM Solutions. Retrieved October
2007, from http://www.webbuyersguide.com/bguide/
Whitepaper/WpDetails.asp?wpId=Nzc4MQ&hidresty
peid=1&category=
Webster, J. G., & Lin, S. (2002). The internet audience:
Web use as mass behavior. Journal of Broadcasting &
Electronic Media, 46, 1-12.
WebTrends. (2007). Retrieved December 9, 2007, from
http://www.webtrends.com/
Wei, R., & Lo, V. H. (2006). Staying connected while
on the move: Cell phone use and social connectedness.
New Media & Society, 8(1), 53-72.
Weible, R. J. (1993). Privacy and data: An empirical study of the influence of types of data and situational context upon privacy perceptions (Doctoral dissertation, Department of Business Administration, Mississippi State University). Dissertation Abstracts International, 54(06).
Welcome to the Safe Harbor. (2007). Trade Information
Center. Retrieved June 5, 2007, from http://www.export.
gov/safeharbor/
Westin, A. (1967). Privacy and freedom. New York:
Atheneum.
Westin, M. (1996). The Minnesota government data
practices act: A practitioner's guide and observations
on access to government information. William Mitchell
Law Review, 22, 839-902.
White, R. C. A., & Ovey, C. (2006). Jacobs and White:
The European convention on human rights (4th ed.).
Oxford, UK: Oxford University Press.
Wikipedia. (2007). Public key infrastructure. Retrieved October 1, 2007, from http://
en.wikipedia.org/wiki/Public_key_infrastructure
Williams, D. (2004, March 10). Maximising the benefits of
the information economy. Retrieved August 2, 2005, from
http://www.agimo.gov.au/media/2004/03/21377.html
Williams, D. (2004, February 27). Business guides to combat
spam. Retrieved August 2, 2005, from http://www.agimo.
gov.au/media/2004/02/12070.html
Wilson, T. (2007, November 12). ID thief admits using botnets to steal data. Retrieved November 2007,
from http://www.darkreading.com/document.asp?doc_
id=138856
Winn, J. K., & Wright, B. (2001). The law of electronic
commerce. Aspen Publishers.
Wolf, C. (2004). California's new online privacy policy
law has nationwide implications. Journal of Internet
Law, 7(7), 3-8.
Word of Mouth Marketing Association. (2006, November 13).
Dell makes public commitment to word of mouth ethics.
Retrieved January 5, 2007, from http://www.womma.
org/womnibus/007895.php

Working Group on Electronic Commerce and Consumers (Canada). (1999). Principles of consumer protection for electronic commerce - a Canadian framework. Ottawa: Canada Bankers Association.
Working Group on Electronic Commerce and Consumers (Canada). (2004). Canadian code of practice for consumer protection in electronic commerce. Ottawa: Office of Consumer Affairs, Industry Canada.
World Wide Web Consortium (W3C). (2006). The platform for privacy preferences 1.1 (P3P1.1) specification. World Wide Web Consortium. Retrieved July 7, 2007, from http://www.w3.org/TR/P3P11/
World Wide Web Consortium (W3C). (2007). The platform for privacy preferences (P3P) project. World Wide Web Consortium. Retrieved July 5, 2007, from http://www.w3.org/P3P/
Xie, E., Teo, H.-H., & Wan, W. (2006). Volunteering personal information on the internet: Effects of reputation, privacy notices, and rewards on online consumer behavior. Marketing Letters, 17, 61-74.
Yamazaki, F. (2005). Privacy protection law. Nikkei BP.
Yank, G. C. (2004, December 21). Canning spam: Consumer protection or a lid on free speech? Retrieved October 2007, from http://www.law.duke.edu/journals/dltr/articles/2004dltr0016.html
Yao, M. Z. (2005). Predicting the adoption of self-protections of online privacy: A test of an expanded theory of planned behavior model. Unpublished dissertation, University of California, Santa Barbara.
Ackerman, M. S., & Cranor, L. F. (1999). Privacy critics: UI components to safeguard users' privacy. In Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI'99) (pp. 258-259). Pittsburgh, PA.
Yao, M., Rice, R., & Wallis, K. (2007). Predicting user concerns about online privacy. Journal of the American Society for Information Science and Technology, 58(5), 710-722.
Yianakos, C. (2002). Nameless in cyberspace - protecting online privacy. B+FS, 116(6), 48-49.
Youn, S. (2005). Teenagers' perceptions of online privacy and coping behaviors: A risk-benefit appraisal approach. Journal of Broadcasting & Electronic Media, 49(1), 86-110.
Young, J. F. (1971). Information theory. New York: Wiley Interscience.
Young, K. S. (1996). Psychology of computer use: XL. Addictive use of the Internet: A case that breaks the stereotype. Psychological Reports, 79(3), 899-902.
Young, K. S. (1998). Internet addiction: The emergence of a new clinical disorder. CyberPsychology & Behavior, 1(3), 237-244.
Yousafzai, S., Pallister, J., & Foxall, G. (2003). A proposed
model of e-trust for electronic banking. Technovation,
23(11), 847-860.
Yu, B., & Singh, M. P. (2002). Distributed reputation
management for electronic commerce. Computational
Intelligence, 18(4), 535-549.
Yu, B., & Singh, M. P. (2002). An evidential model of
distributed reputation management. In Proceedings of
the First International Joint Conference on Autonomous
Agents and Multiagent Systems, AAMAS 02, Bologna,
Italy (pp. 294-301). New York: ACM Press.
Yu, E., & Cysneiros, L. (2002). Designing for privacy
and other competing requirements. In Proceedings of
the 2nd Symposium on Requirements Engineering for
Information Security (SREIS-02).
Yu, T., Winslett, M., & Seamons, K. E. (2003). Supporting structured credentials and sensitive policies through
interoperable strategies for automated trust negotiation.
ACM Transactions on Information and System Security,
6(1), 1-42.
Zacharia, G., & Maes, P. (2000). Trust management
through reputation mechanisms. Applied Artificial Intelligence, 14, 881-907.
Zadok, E., Badulescu, I., & Shender, A. (1998). Cryptfs:
A stackable vnode level encryption file system (Technical
Report CUCS-021-98). Columbia University: Computer
Science Department.
Zekios, G. I. (2007). State cyberspace and personal cyberspace jurisdiction. International Journal of Law and Information Technology, 15, 1-37.
Zhang, H. (2005). Trust-promoting seals in electronic
markets: impact on online shopping. Journal of Information Technology Theory and Application, 6(4), 29.
Zhang, X. (2005). What do consumers really know?
Communications of the ACM, 48(8), 44-48.
Zhong, Y. (2005). Formalization of trust. Ph.D. thesis,
West Lafayette, IN: Purdue University.
Zhong, Y., & Bhargava, B. (2004, September). Using
entropy to trade privacy for trust. In Proceedings of
the NSF/NSA/AFRL Conference on Secure Knowledge
Management, SKM 2004, Amherst, NY (6 pages).
Zhong, Y., Lu, Y., & Bhargava, B. (2004). TERA: An


authorization framework based on uncertain evidence
and dynamic trust (Tech. Rep. No. CSD-TR 04-009).
West Lafayette, IN: Purdue University.
Zhong, Y., Lu, Y., Bhargava, B., & Lilien, L. (2006).
A computational dynamic trust model for user authorization (Working Paper). West Lafayette, IN: Purdue
University.
Zimmermann, P. R. (1995). The official PGP user's guide. Cambridge, MA: The MIT Press.
Zureik, E. (2004). Governance, security and technology:
The case of biometrics. Studies in Political Economy,
73, 113.


About the Contributors

Kuanchin Chen is an associate professor of computer information systems at Western Michigan
University. He received his DBA in information systems from Cleveland State University and his MS in
information systems from the University of Colorado. His research interests include electronic business, online
privacy & security, issues in the online user environment, Internet technologies, and data mining techniques.
Dr. Chen serves on the editorial advisory boards of several academic journals and business magazines.
He has received grants from sources ranging from universities to the federal government. He has been promoting online interactivity, privacy, and security through research studies and grants. Dr. Chen's research has appeared
in such journals as Information & Management, IEEE Transactions on Systems, Man, and Cybernetics,
Communications of the Association for Information Systems (AIS), IEEE Transactions on Education,
Journal of Computer Information Systems, and many others. He is the director of Web technology of the
Informing Science Institute.
Adam Fadlalla is a professor of computer and information science at Cleveland State University. He
holds an MBA in finance and decision sciences from Miami of Ohio, an MSc in computer science, and a
PhD in computer information systems from the University of Cincinnati. His current research interests
include decision support systems, artificial intelligence applications, knowledge discovery in databases,
information systems security and privacy issues, enterprise integration systems, and medical informatics. In addition, Dr. Fadlalla has developed an interest in issues of global and cross-cultural information
technology as a result of being a Fulbright fellow twice. His published research covers a broad spectrum
of information systems issues.
***
Bharat Bhargava is a professor of computer science with a courtesy appointment in the School of
Electrical and Computer Engineering at Purdue. He conducts research in security and privacy issues in
distributed systems. He received a best paper award at the IEEE Data Engineering conference. He was
awarded the IEEE Technical Achievement Award for contributions to foundations of adaptability in distributed systems and the IEEE Computer Society Golden Core service recognition. He was inducted into
Purdue's Book of Great Teachers. He serves on seven journal editorial boards and on the Technical Achievement Award and Fellow committees of the IEEE Computer Society. He founded the IEEE Symposium
on Reliable and Distributed Systems, the IEEE Conference on Digital Libraries, and the ACM Conference on
Information and Knowledge Management. He is an IEEE and IETE fellow.
Craig Bisset is a former student of the University of Auckland. He graduated with a BA majoring in
geography and political science. At present, he is completing his MBA at National Cheng Kung University
in Taiwan. His current areas of research interest are Chinese consumer behavior and e-commerce.
Tom S. Chan is an associate professor at the Information Department, Southern New Hampshire
University at Manchester, New Hampshire, USA. He holds an EdD from Texas Tech University and an
MSCS from the University of Southern California. Prior to SNHU, he was an assistant professor at Marist
College and, as a project manager and software designer, specialized in data communication at Citibank.
He has published works in the area of instructional design, distance learning, technology adaptation,
information security, and Web design.
Jengchung V. Chen is assistant professor in telecommunications management at National Cheng Kung
University, Taiwan. He has published articles dealing with privacy issues in journals like International
Journal of Organizational Analysis, Labor Law Journal, and Information, Communication, and Society.
He holds a PhD in communication and information sciences from the University of Hawaii and master's
degrees in policy and management from SUNY-Stony Brook and in computer science from Polytechnic University.
Andy Chiou is an alumnus of New York University with a BA in sociology and economics, and is,
as of the time of this writing, completing his MBA at National Cheng Kung University in Taiwan.
Upon completion of his MBA, Andy plans to continue pursuing a doctoral degree in the United States.
His current areas of interest are general management and cross-cultural issues.
Ken Coghill was born in Australia (1944); he has been a veterinarian and a public servant, and was elected to
Wodonga Council and as a member and speaker of parliament. He joined Monash University in 1996,
where he teaches Governance and Business & Government in master's programs and supervises PhD research students studying diverse aspects of governance. Associate Professor Coghill is a co-director of the Monash
Governance Research Unit, where he directs and undertakes research on integrated governance, that is,
the dynamic, evolving inter-relationships of the public, corporate and civil society sectors as they affect
the governance of nation-states.
J. Stephanie Collins earned a PhD in management information systems in 1990 from the University
of Wisconsin. She has taught in the field since 1988, has published papers in various journals, and has
presented at conferences. She has published papers on information technology outsourcing, technology applications for economic development, IT education, and other technical issues. She has also worked
as an IT consultant, and has developed several systems. Her current research is focused on how the uses
of internet technologies change the environment for business and for education.
G. Scott Erickson is associate professor and chair of the Marketing/Law Department in the School of
Business at Ithaca College, Ithaca, NY. He holds a PhD from Lehigh University and master's degrees from
Thunderbird and SMU. He has published widely on intellectual property, intellectual capital, competitive
intelligence, and a number of other related topics. His book with Helen Rothberg, From Knowledge to
Intelligence, was published by Elsevier in 2005. His consulting work began over 20 years ago with the
Alexander Proudfoot Company and continues today.
Louis K. Falk received his doctorate in mass communication from the University of Southern Mississippi, where he graduated with an emphasis in advertising and public relations. Dr. Falk is an associate professor in the Department of English & Communication, University of Texas at Brownsville. His
research interests include the impact of new technologies on marketing, advertising, and public relations.
Dr. Falk has recently been published in a variety of journals including The Journal of Website Promotion, the Journal of E-Business, and the Journal of Promotion Management. Dr. Falk is also an elected board
member and Webmaster of the International Academy of Business Disciplines. His Web site address is
http://www.louisfalk.org/
Philip Flaherty was awarded a Bachelor of Civil Law (BCL) from NUI, Galway in 2005; he was
awarded a diploma in legal Irish in 2004. He holds an LLM in public law from NUI, Galway. He co-authored
a consultation paper on statute law restatement for the Law Reform Commission in 2007 and conducted
research on the commission's e-conveyancing road map project. He is currently researching the Settled
Land Acts and will produce a consultation paper on this area of law for the commission in 2008. Philip
will join the Dublin law firm McCann FitzGerald in 2008. Philip is a regular contributor to legal conferences and legal journals.
Anil Gurung is an assistant professor of business and information management at Neumann College.
He received his PhD from the University of Texas at Arlington. Current research interests are in the areas
of IT adoption, information security and privacy, e-commerce, and cultural and social aspects of business
computing. He has published in various journals and conference proceedings.
Huong Ha currently holds the position of deputy course director at TMC Business School, TMC
Academy, Singapore, and is a PhD candidate at Monash University, Australia. She holds a master's degree
in public policy from the National University of Singapore. She has published many book chapters, journal articles,
peer-reviewed conference papers, and encyclopedia articles. She has been awarded a research grant
by Consumer Affairs Victoria (Australia), a distinguished paper award (Turkey and the USA) and many
international travel grants.
Naoki Hamamoto received his MBA in computer information systems from Western Michigan
University, Kalamazoo, Michigan. He was involved in several major research projects during his graduate
studies. He is currently an information professional in internal system auditing. Mr. Hamamoto's research
interests include global information security management, Web 2.0 social networking and community,
and business process management.
Thomas J. Holt is an assistant professor in the Department of Criminal Justice at the University of
North Carolina at Charlotte. He has a doctorate in criminology and criminal justice from the University
of Missouri-Saint Louis. His research focuses on computer crime, cyber crime, and the role that technology and the Internet play in facilitating all manner of crime and deviance. Dr. Holt has authored several
papers on the topics of hacking, cyber crime, and deviance that have appeared in journals such as Deviant
Behavior and the International Journal of Comparative and Applied Criminal Justice. He is also a member
of the editorial board of the International Journal of Cyber Criminology.
Chiung-wen (Julia) Hsu received her PhD in communication from SUNY Buffalo in 2003. She is
now an assistant professor in the Department of Radio & Television at National Cheng Chi University, Taiwan. Julia's research interests include communication technology, journalism, mass communication, and
Internet research, especially online privacy issues. She is interested in users' behavioral differences
between online and offline worlds, and in different Internet platforms. She developed a situational model
and has conducted several empirical studies. Julia's research has been published in journals such as the
Asian Journal of Communication, Telematics & Informatics, Online Information Review, and Cyberpsychology & Behavior.
Anurag Jain is an assistant professor at the Bertolon School of Business, Salem State College, MA. He
has over 12 years of industry experience, including strategic brand management, financial planning, global
bilateral business promotion, and IT services. His research has appeared in the proceedings of several
leading conferences, including the Americas Conference on Information Systems, the Decision Sciences
Institute, the Southwest Decision Sciences Institute, and the Northeast Decision Sciences Institute. His current research
interests include IT capabilities, value, and the management of IT resources; knowledge
management and the adaptive and sustainable competitive enterprise; business intelligence and activity monitoring;
and the influence of information technology on organizations and society. He holds a PhD from the University
of Texas-Arlington. He holds a master of science degree from the University of Illinois at Urbana-Champaign; a postgraduate diploma in business management from Sydenham Institute of Management; and a
bachelor of commerce degree from The University of Bombay.
Thejs Willem Jansen holds a master of science (computer science engineering) from the Technical
University of Denmark. He co-authored his MSc thesis entitled Privacy in Government IT-Systems with
Søren Peen, which developed the model presented in this chapter and resulted in a paper published at the
Sustaining Privacy in Autonomous Collaborative Environments 2007 workshop. Thejs Jansen currently
holds a position as an IT auditor at PricewaterhouseCoopers, where he reviews IT security procedures by
identifying issues, developing criteria, reviewing and documenting client processes and procedures. His
work includes working with Sarbanes-Oxley clients.
Christian Damsgaard Jensen holds a master of science (computer science) from the University of
Copenhagen (Denmark) and a PhD (computer science) from Université Joseph Fourier (Grenoble, France).
Dr. Jensen currently holds a position as associate professor at the Technical University of Denmark, where
he supervised the MSc thesis project of Thejs Willem Jansen and Søren Peen. Dr. Jensen works in the areas
of system security and distributed systems, with a particular focus on pervasive computing systems. Dr.
Jensen has published more than 50 papers in international peer-reviewed journals, conferences, symposia,
and workshops.
Sean Lancaster is a lecturer with the Department of Decision Sciences and Management Information Systems at Miami University. He teaches undergraduate courses on information systems & business
strategy, Web design, VisualBasic.NET, database design, and e-commerce. Sean earned his MBA from
Miami University in 2002.
Suhong Li is an associate professor of computer information systems at Bryant University. She earned
her PhD from the University of Toledo in 2002. She is a member of the Council of Supply Chain Management Professionals, the Decision Sciences Institute, and the International Association for Computer Information
Systems. She has published in academic journals including Journal of Operations Management, OMEGA:
the International Journal of Management Science, Decision Support Systems, Journal of Computer Information Systems, Journal of Managerial Issues, and International Journal of Integrated Supply Management. Her research interests include supply chain management, electronic commerce, and adoption and
implementation of IT innovation.
Leszek Lilien's research focuses on opportunistic networks or oppnets (a specialized kind of ad hoc
networks); and trust, privacy, and security in open computing systems. He serves on the editorial boards of
the International Journal of Communication Networks and Distributed Systems, The Open Cybernetics &
Systemics Journal, and Recent Patents on Computer Science. He was the main organizer and chair of the
International Workshop on Specialized Ad Hoc Networks and Systems (SAHNS 2007), held in conjunction with the 27th IEEE International Conference on Distributed Computing Systems (ICDCS 2007). He
is a senior member of the Institute of Electrical and Electronics Engineers (IEEE).
Elizabeth Ann Maharaj is a statistician and a senior lecturer in the Department of Econometrics and
Business Statistics at the Caulfield Campus of Monash University. She teaches subjects on business data
analysis, business forecasting, and survey data analysis. Her main research interests are in time series
classification and forecasting. She has presented several papers on time series classification at international
conferences, and much of this work has been published in international journals. Elizabeth has also been involved
in research projects in climatology, environmental science, labour markets, human mobility, and finance,
and she has also published in journals in some of these fields.
Karin Mika has been associated with the first-year legal writing program at Cleveland-Marshall College of Law since 1988. She has also worked as an adjunct professor of English at Cuyahoga Community
College and is a research consultant for various firms and businesses in the Cleveland area. In addition,
Professor Mika was faculty advisor for the law school's moot court program and currently judges at various national moot court competitions. She has lectured on essay writing technique for several bar review
courses and has written bar exam essay questions for both the California and Minnesota bar examiners. Professor Mika's areas of scholarly research are varied, and she has published in the areas of Native
American law, Internet law, and health care. Her administrative and teaching responsibilities include the legal writing,
research, and advocacy program.
Shahriar Movafaghi received a PhD in computer science from Northwestern University, with over 20
years of hands-on technical experience. Dr. Movafaghi has published numerous papers in the areas of digital
rights, data warehousing, databases, system architecture, software engineering, object-oriented technology, application development, and teaching techniques for IT. He has architected and led many software
system projects in the financial, apparel, publishing, and computer hardware/software industries as well
as directed government-funded research and development projects. He has taught courses at various universities including SNHU, UNH, BU, and UMASS Lowell.
Charles O'Mahony is a legal researcher with the Law Reform Commission of Ireland. He was awarded
a BA in 2003 and an LLB in 2004 from NUI, Galway. He holds a master's in law (LLM) from University
College London and an LLM in public law from NUI, Galway. He was the principal legal researcher for
the Commission's Third Programme of Law Reform and authored a report on the Third Programme. He
is currently preparing a consultation paper on reform of the jury system in Ireland, which is due for publication in 2008. He is a senior tutor in law at University College Dublin, where he has tutored courses on
legal systems and methods, tort law, and constitutional frameworks.
Betty J. Parker is associate professor of marketing & advertising at Western Michigan University,
where she teaches undergraduate and graduate courses in Internet marketing, media research, and advertising. She has published papers about the Internet as a communication tool and the public policy aspects
of marketing alcohol, food, and prescription drugs. She holds a PhD from the University of Missouri, an
MBA from the University of Illinois-Chicago, and a BA from Purdue University.
Andrew Pauxtis is currently a graduate student in the information systems management master's
program at Quinnipiac University in Hamden, CT. His areas of interest in information technology include Web development and design technologies, search engine technologies, search engine
optimization, Internet marketing, and Web 2.0. He graduated from Quinnipiac University in 2007 with a
BA in mass communications.
Søren Peen holds a master of science (computer science engineering) from the Technical University of
Denmark. He co-authored his MSc thesis entitled Privacy in Government IT-Systems with Thejs Jansen,
which developed the model presented in this chapter and resulted in a paper published at the Sustaining
Privacy in Autonomous Collaborative Environments 2007 workshop. Since his graduation, Søren Peen has
completed his national service in the Danish Defence Forces and is currently employed as an IT-specialist
at IBM's Copenhagen Crypto Competence Center.
Alan Rea earned his BA at The Pennsylvania State University, an MA at Youngstown State University,
an MS at the University of Maryland, Baltimore County, and his PhD at Bowling Green State University.
Alan is an associate professor of business information systems at Western Michigan University. Since
1997 Alan has taught courses in Web development and design, system administration, and various object-oriented language programming courses. Alan researches topics in artificial intelligence, hacker culture,
security, and virtual reality. He has published articles in these fields, as well as authored several textbooks
on an assortment of information technology topics.
Bernadette H. Schell, the founding dean of the Faculty of Business and Information Technology at the
University of Ontario Institute of Technology in Canada, has authored four books on the topic of hacking:
The Hacking of America: Who's Doing It, Why, and How (2002); Contemporary World Issues: Cybercrime
(2004); Webster's New World Hacker Dictionary (2006); and Contemporary World Issues: The Internet and
Society (2007). She has also written numerous journal articles on topics related to violence in society and
is the author of three books dealing with stress-coping in the workplace (1997), the stress and emotional
dysfunction of corporate leaders (1999), and stalking, harassment, and murder in the workplace (2000).
Angelena M. Secor received her MBA in computer information systems from Western Michigan
University, Kalamazoo, Michigan. She is currently an IT consultant in the healthcare field. Ms. Secor's
research interests include information security, information privacy, and offshore outsourcing of healthcare services.
Hy Sockel received his doctorate in management information systems from Cleveland State University.
Dr. Sockel is a visiting professor in the management information systems area at Indiana University South
Bend. His research interests include the impact of technology on the organization and its employees, electronic commerce, database technologies, and application systems. Dr. Sockel has recently been published
in a variety of journals including: Journal of Management Systems, The Journal of Website Promotion,
and the International Journal of Web Engineering and Technology.
J. Michael Tarn is a professor of business information systems at Western Michigan University. He
holds a PhD and an MS in information systems from Virginia Commonwealth University. Dr. Tarn specializes in multidisciplinary research, involving ICT, EC, and strategic management. He has published
various articles in professional journals, book chapters, and refereed conference proceedings. Professor
Tarn coauthored the first scholarly book in ES education, Enterprise Systems Education in the 21st Century. He is managing editor of International Journal of Management Theory and Practices. His areas of
expertise are information security management, data communications management, Internet research,
international MIS, and critical systems management.
Barbara J. Tyler is a registered nurse and attorney who taught legal writing and drafting for 16 years, as
well as directing the Legal Writing Department for 6 years. She serves as a consultant to practitioners and
firms on medical malpractice claims, risk management, and writing and drafting. She found it most rewarding
in her teaching career to serve as faculty advisor to the Cleveland-Marshall Journal of Law and Health for
6 years, as well as advisor for 4 years to the Delta Theta Phi Law fraternity. Professor Tyler was honored
with the Wilson Stapleton law alumni award for teaching excellence in 2005, as well as the International
Delta Theta Phi law fraternity's most outstanding teacher of the year award for 2005-06. She is active in many legal
and community organizations. Her research and writing interests are varied, and she has published in the
areas of medicine and law, Internet law, insurance law, and art law, as well as learning theory.
Bruce A. White is the chair of information systems management at Quinnipiac University in Hamden,
CT. He is also on the Educational Foundation for the Institute for Certification of Computer Professionals,
on the editorial review board for the Journal of Information Systems Education and the Global Journal of
Information Management. He has chaired the ISECON conference four times. His current research is on
Web 2.0 technologies, assessment processes, health information systems, and outsourcing.
David C. Yen currently holds the Jennifer J. Petters chair in Asian business and is a professor of MIS in the Department of Decision Sciences and Management Information Systems at Miami University. He held the
Raymond E. Glos professorship in business from 2005-2007 and was department chair from 1995-2005. Since
receiving his PhD in MIS and MS in computer sciences in 1985, Professor Yen has been active in research. He
has published books and articles which have appeared in Communications of the ACM, Decision Support
Systems, Information & Management, Information Sciences, Computer Standards and Interfaces, Information Society, Omega, International Journal of Organizational Computing and Electronic Commerce, and
Communications of AIS, among others. Professor Yens research interests include data communications,
electronic/mobile commerce, and systems analysis and design.
Chen Zhang is an assistant professor of computer information systems at Bryant University. He received
his MS and PhD in computer science from the University of Alabama in 2000 and 2002, and a BS from
Tsinghua University, Beijing, China. Dr. Zhang's primary research interests fall into the areas of computer
networks and distributed systems. He is a professional member of ACM and IEEE.





Index

A

Adobe.com 7
AdWords 1, 3, 7, 8, 9, 10, 11, 12, 13
aesthetic experience 224, 225
AltaVista 5
Amazon 4, 205, 260
anonymity 19, 30, 65, 83, 86, 100, 117, 118, 217, 235, 356, 371, 394, 397, 398, 409, 416, 418
anonymity set 93, 100, 101, 113
anonymous surfing tools 278
ANSI bombs 40
Asia Pacific Economic Co-Operation (APEC) 129
asset theft 36
attack machine 41
Australian Consumers Association (ACA) 134
Australian Guidelines for Electronic Commerce (AGEC) 132
automated trust negotiation (ATN) 96
Automatic Processing of Personal Data 333, 340

B

banner ads 11, 181, 263
BBBOnline Privacy 134, 269, 276, 277
Better Business Bureau (BBB) 88
BioAPI 305
biometric characteristics 300, 303
black listing 48
blogs 52, 53, 210, 217, 218, 229, 237, 257, 261, 264, 285, 390, 400, 415, 424
Boeing 261, 264, 282
bot 8, 40, 41, 42, 50
bot herder 41, 42
botnet 41, 42, 43
Botnet Task Force 43
Brussels Convention on Jurisdiction and Recognition of Foreign Judgments 330
business-to-business (B2B) 372
business to customer (B2C) 372

C

CartManager 19
casual consumer 2
Children's Online Privacy and Protection Act 17
Children's Online Privacy Protection Act (COPPA) 262, 271
civil liberties 307, 308, 309, 406
click streams 181
Coca-Cola 264, 283
code of ethics 137
common biometric exchange file format (CBEFF) 305
common identifiers 69, 74
computer-mediated communication (CMC) 233
computer cache 27
computer underground (CU) 190, 192, 195
confidential business information (CBI) 316
confidentiality 66, 105, 118, 121, 422
confidentiality-integrity-availability (CIA) 88
consumer protection 2, 17, 18, 111, 123, 124, 125, 126, 127, 128, 129, 131, 132, 133, 134, 135, 136, 137, 138, 139, 141, 142, 143, 144, 145, 146, 232, 255, 280, 326, 327, 340, 347, 358, 361, 389, 395, 397, 403, 408, 413, 414, 418, 420, 427
consumer rights 126
content abuse 36
cookie 6, 7, 22, 259, 260, 266, 274, 419
Council of Europe's European Convention of Human Rights 333
CPA Web Trust 271
CPU cycles 100
Crackers 194, 195
credit card data 18, 315
cyber crime 196
Cybercrime Act 42
cyberfly 92, 122
cybermedicine 347, 355, 361, 363, 367, 368, 409, 420
Cyber Security Enhancement Act 17, 23, 31, 198, 404
cyber stalking 202, 203, 223, 230

D
data-processing 65
data-storage 65, 66
Data Protection Directive 63, 127, 128, 320, 333,
335, 336, 337, 338, 339, 340, 341, 343, 345
denial of service (DoS) 36
Department of Justice 43, 313
diffused audiences 219
digital fingerprint 244, 245, 246
digital rights management (DRM) 240, 241
disposition to trust 94, 95, 170
disruption 36
distributed DoS (DDoS) 36
Do Not Call (DNC) Registry 285, 286

E
e-commerce xvii, 3, 51, 81, 86, 95, 112, 114, 117,
120, 142, 143, 144, 145, 146, 148, 151, 154,
155, 160, 161, 162, 163, 168, 170, 171, 173,
175, 177, 178, 182, 185, 186, 187, 188, 189,
191, 195, 196, 216, 241, 243, 249, 251, 252,
263, 266, 270, 271, 272, 273, 277, 279, 280,
301, 326, 327, 328, 329, 331, 332, 333, 337,
339, 340, 342, 344, 372, 373, 384, 385, 386,
391, 397, 398, 403, 405, 406, 408, 411, 412,
413, 415, 418, 419, 420, 422, 424, 425
e-retailer 136
Electronic Commerce Directive 329
Electronic Privacy Information Center (EPIC) 84,
167, 183, 188, 398
electronic risk management 38
encryption system 247
end user 47
Entertainment Software Rating Board (ESRB) 271
European Court of Human Rights 334, 345
European extra-judicial network (EEJ-Net) 328
European Parliament 55, 61, 63, 66, 80, 82, 84, 399
European Union (EU) 128, 149, 326, 327
Excite 5
exposure 38

F
Facebook 181, 182, 183, 184, 187, 293, 294, 393,
399, 409, 422
Fair and Accurate Credit Transactions 22, 23
falsification of private information 27

FBI 43, 45, 46, 53, 194, 209, 210, 211, 394, 402, 426
Federal Information Security Management Act 23,
25
Federal Trade Commission (FTC) 4, 17, 249, 258,
270, 286
file-sharing 41
firewalls 46
flooding 197
focus 38, 292
Fortune 100 207, 269, 271, 273, 274, 275, 276, 277,
282
Freedom of Information Act (FOIA) 311
FTC v. Petco 18, 25

G
gender-neutral IDs 28
Gmail 1, 8, 9, 13, 264
Google 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 21, 39, 49, 54, 166, 180, 181, 183, 185, 191, 264, 314, 366, 392, 393, 395, 398, 402, 404, 408, 411, 412, 414, 417, 423, 424, 425
Google Calendar 3
Google Checkout 3, 11
Google Desktop 3, 12
Google Earth 3, 7
Google Groups 3
Google Maps 3
Google News 3
Google Notebook 3
Google Talk 3
gray listing 48

H
half-life 38
Health Insurance Portability and Accountability Act (HIPAA) 17, 20, 24, 31, 86, 207, 253, 271, 307, 348, 352, 353, 366, 368, 404
Homeland Security Act 17, 23, 31, 198, 404
House of Representatives 17
hybrid listing 48

I

identity theft 18, 19, 307
Identity Theft and Assumption Deterrence Act 307
IIA Content Regulation Code of Practice 133
informational privacy 90, 322, 391
information technology (IT) 34, 371
Institute of International Management (IIM) 292
integrity 24, 25, 45, 47, 51, 87, 88, 95, 103, 104, 105, 114, 154, 232, 242, 258
intellectual property rights (IPR) 196, 197, 207
Internet cafés 34
Internet itinerary 124
Internet protocol (IP) 274
Internet relay chat (IRC) 42
Internet service providers (ISPs) 285
IP-based black list technique 42
IP delivery 6
iTunes 166, 168, 182, 185, 389, 391, 410

J

Japan's Act on the Protection of Personal Information 370, 371, 372, 374

K

keyloggers 42

L

LAN 53, 88, 415
legal privacy controls 105
limiting data 69, 74
Lycos 5

M

malware 18, 21, 30, 39, 40, 140, 390
McDonald's 261
media drenching gratification 230
Medicaid 351, 353
Medicare 351, 353
Michinoku Bank 382, 383, 384
MiniStore 166, 185, 410
MIT.edu 7
mobile encryption 44
multiple identities 28, 158, 175, 181
MX hosts 19

N

NASA.gov 7
National Principles for the Fair Handling of Personal Information (NPPs) 130
non-governmental organisations (NGOs) xvi, 58
NSF.gov 7
NSW Office of Fair Trading 126, 144, 414

O

obfuscating 94
O'Donnell v. Blue Cross Blue Shield of Wyo. 354, 367, 414
Olmstead 314
online privacy policy (OPP) 271
online privacy protection behavior xvii, 152, 158, 160, 161
online trust 208
Operation Bot Roast 43
opt-out system 378
Organisation for Economic Co-Operation and Development (OECD) 128
outsourcing 69, 73

P

PageRank feature 6
patient rights 360
PayPal 43, 99
PC Magazine 6, 9, 13, 135, 144, 145, 411, 416
persistence 38
personal computers (PCs) 191
personal digital assistants (PDAs) 33
personal privacy xv, xviii, 1, 3, 4, 57, 81, 90, 165, 167, 171, 187, 222, 242, 294, 295, 297, 313, 318, 319, 321, 423, 426
phishing 21, 132, 147, 212, 213
phreaking 125, 197
Platform for Privacy Preferences (P3P) 134, 372
prevalence 38
Privacy Act 1988 130, 131, 132, 147
privacy enhancing technology (PET) 58
privacy evaluation 67
privacy protection behavior xvii, 152, 156, 158, 159, 160, 161
Privacy Rights Clearinghouse 38, 54, 59, 83, 362, 368, 369, 417
pseudo-voluntary data dissemination 90
public key infrastructure (PKI) 246

Q

questionnaires 81

R

Rbot 42
Real.com 7
Record Industry Association of America (RIAA) 285
remote process control (RPC) 41
repudiation 36
reputation deviation 95

S

Safe Harbor 105, 119, 138, 269, 276, 277, 426
Safe Internet Pharmacy Act 368, 423
search engine logging 7
search engine optimized (SEO) 5
security administrator tool for analyzing networks (SATAN) 208
sensitivity 68, 72, 78, 79
situational paradigm 214, 216
smartphones 33
social engineering 36
social paradigm of trust 114
sociograms 95
Soft Bank, Co. 381
SPAM 18, 21, 22, 23, 31, 193, 257, 264, 285, 286, 287, 289, 296, 392, 423
Spam Act 2003 130, 131, 132, 147
spoofing 197
Spyware 18, 21, 22, 23, 24, 31, 40, 132, 141, 147, 260, 266, 286, 287, 297, 394, 401, 405, 408
SQL injection attacks 18, 25, 27
State Farm 276, 282
steganography 105, 113, 244, 245, 246
Storm 42
structural equation models (SEM) 233
surveillance 39, 69, 72, 78, 79, 242
syndication 248

T

tamper-proofing 94, 116, 395
Television without Frontiers Directive 330
territorial privacy 90
terrorism 34, 84, 211, 212, 291, 295, 296, 308, 345, 423
The Electronic Signatures Directive 331
Theory of Human Identification 302
transparency 70, 73, 78, 79, 80
trust 13, 27, 29, 30, 32, 49, 59, 71, 82, 85, 86, 87, 88, 89, 90, 92, 93, 94, 95, 96, 97, 98, 99, 100-195, 202-210, 234, 237, 265, 266, 267, 268, 270-294, 301, 327, 351, 372-398, 400-428
trust-enhanced role assignment (TERA) 110
TRUSTe 130, 134, 181, 187, 269, 271, 276, 277, 359, 385, 391, 424
trustee 87, 89, 94, 95
truster 87, 89, 94, 95
trust gains 93

U

U.K. Scotland Yard Computer Crime Research Center 45
U.S. Can-Spam Act 132
U.S. Court of Appeals 288
U.S. Safe Web Act 17, 23
underage consumer protection 18, 20
United Nations (UN) 128, 149
unlinkability 65
unobservability 65, 83, 394, 397, 416
USA Patriot Act 17, 23, 31, 425

V

verifying trust 97, 98
vulnerability management 50

W

Walt Disney 261, 282
watermarking 86, 94, 244, 245, 253, 396
Web bugs 177, 181, 192, 260, 274
WebTrends 166, 167, 187, 426
white listing 48
WiFi 88
Wired Magazine 4, 11
World Wide Web Consortium (W3C) 167, 187, 249, 372, 427

X

XML scripting language 167

Y

Yahoo! 3, 5, 21, 381, 382, 383, 384, 385, 422
YouTube 3, 9, 10, 11

Z

zombie machine 41
