Ellen S. Miller
Co-Founder and Executive Director
The Sunlight Foundation
1818 N Street, NW
Suite 300
Washington, DC 20036
December 28, 2009

Comment on E9-25757 -- Improving Implementation of the Paperwork Reduction Act

To the Office of Information and Regulatory Affairs:

The Sunlight Foundation welcomes the opportunity to comment on improving the Paperwork
Reduction Act.

The Sunlight Foundation was founded in 2006 with the non-partisan mission of using the
power of the Internet to make information about the federal government more meaningfully
accessible to citizens. Through our projects and grant-making, Sunlight catalyzes political
transparency and fosters openness and accountability in government. Our comments focus
on the intersection between the Paperwork Reduction Act and the government's presence on
the Internet.

The Paperwork Reduction Act has multiple purposes, four of which are particularly applicable
here:

• "Minimize the paperwork burden . . . resulting from the collection of information by


or for the Federal Government";

• "Ensure the greatest possible public benefit from and maximize the utility of
information . . . collected. . . and disseminated by or for the Federal Government";

• "Provide for the dissemination of public information on a timely basis, on equitable


terms, and in a manner that promotes the utility of the information to the public and
makes effective use of information technology"; and

• "Improve the quality and use of Federal information to strengthen decisionmaking,


accountability, and openness in Government and society".

See 44 U.S.C. 3501.

Government agencies have traditionally interpreted the Paperwork Reduction Act in ways
that subvert these purposes when it comes to the government's presence online.
Specifically, government websites do not implement user surveys without first receiving
approval from OIRA in a process that is lengthy and laborious. See 5 C.F.R. 1320 et seq.
Moreover, agencies interpret the term "survey" broadly, banning or restricting tools that
allow users to publicly rank or assess the usefulness of information. Finally, the authors of
the Paperwork Reduction Act never considered the ways that the Internet would allow
citizens to directly communicate with one another on government websites through the use
of voluntary surveys.

Nature of the Information Collection


We must start by distinguishing among information collections that are voluntary, voluntary
to receive a benefit, and mandatory, particularly in the online context. Voluntary "surveys"
on government websites are easily distinguishable in that, by their very nature, they impose
no burden, or at most a de minimis one, on citizens. In addition, online surveys allow
citizens to communicate with one another in ways never envisioned by the authors of the
Paperwork Reduction Act, thereby transforming any theoretical burden into a benefit for
citizens.

Although there are strong arguments that mandatory surveys or surveys required to receive
a benefit, when used in an online context, should be subject to fewer restrictions than their
paper equivalents, we focus our comments on voluntary online survey methods. We begin
with an examination of the private sector.

Private Sector Voluntary Survey Methods

The private sector employs many voluntary survey methods to improve the user-friendliness
of its websites. These survey methods include "up or down" votes, star rating systems,
tags, prose comments, traditional surveys, and website analytics.

In the simplest instance, websites allow users to indicate their interest in an item by clicking
on a button to vote for it. See, e.g., the news aggregator Digg.com. Information conveyed
on Digg.com is dynamically organized by the items that have received the most votes or
"diggs." Similarly, video content provider YouTube.com allows users to indicate their
approval or disapproval of comments on videos by voting each comment up or down. Thus,
information that is deemed to be most useful is also the most easily accessible.

A slightly more sophisticated rating system has been implemented by the retailer
Amazon.com, which uses a scale of 1 to 5 stars. Users select the stars in
reviewing a product, with 5 stars indicating a high level of satisfaction. Over time, this
reveals whether there is a consensus opinion on the utility of a product. YouTube.com users
also rate videos on a 5-star scale. YouTube videos may be sorted based upon the number of
votes and ratings, allowing users to find what likely will be most interesting to them.

Another common approach is the labeling or "tagging" of items. Tags are often employed
either to bring items to the website administrator's attention or to create a dynamic index.
Craigslist.org, a retail intermediary, allows users to tag posts as "miscategorized,"
"prohibited," "spam/overpost," or "best of craigslist." This is a signal to administrators that
they need to take further action, thereby allowing the administrators to focus their attention
where it is most useful. By contrast, the photo sharing website Flickr.com allows users to
tag photos with any text they choose. For example, a picture of roses may be tagged by a
user with the terms flowers, roses, red, Valentine's Day, etc. Flickr website users may then
search using the tagged terms. This makes the website's search more effective, leading to
more user-friendly results.

Many websites solicit user feedback in the form of prose comments. Blogs, such as the
Sunlight Foundation's blog.sunlightfoundation.com, encourage reader feedback.
Amazon.com encourages users to review books, and makes those reviews available to
others. The aggregation of all of these reviews provides extraordinarily helpful signals to
other users of these sites, as well as to their owners.

Finally, a number of websites use a traditional voluntary survey to solicit user feedback.
Among all of the tools discussed above, this seems to be the least common method, and is
applied only in specific contexts. Instead, websites rely on tools such as Google Analytics
(http://www.google.com/analytics/) to gain an understanding of visitor behavior. Google
Analytics indicates the pages users visit most frequently, which hyperlinks users click, and
so on. With suitable safeguards, website analytics can help website administrators
understand user behavior without infringing upon individual privacy, and can lead to better
website design.

All of these tools engage users in the shared task of improving the websites they use. The
user experience is significantly improved without imposing the kind of burden at issue in the
Paperwork Reduction Act. Moreover, each of these tools, when properly used by the
government, would significantly improve the user experience on government websites and
provide valuable feedback to those sites' administrators.

Recommendations

Voluntary government website surveys should not fall under the Paperwork Reduction Act.
OIRA should issue clear guidance to this effect. To the extent that the use of voluntary
online government surveys should be centrally coordinated, OMB retains the power to fulfill
that function.

If, however, the Paperwork Reduction Act is interpreted to apply to online voluntary
surveys, the Act's role should be strictly limited. We recommend:

• Expedited approval authority. A government website administrator's request to
implement an online survey, and the subsequent approval of survey language, should
take no more than 15 days from start to finish, not the more than 180 days often
required.

• Delegated approval authority. More agencies should be delegated the authority to
approve surveys under Appendix A to 5 C.F.R. 1320.

• Concurrent comment period with surveys. Instead of requiring public comment on
whether a voluntary survey is appropriate before the survey goes online, public
comment can take place while the survey is available online. Comments on the survey
could also be solicited within the survey itself.

• Pilot programs. Agencies should be permitted to engage in pilot programs to try new
voluntary survey techniques.

• Pre-approved voting/rating surveys. Commonly utilized surveys that use voting/rating
systems should not require additional approval by OIRA. Instead, they should be
pre-approved for use. For example, "voting up or down" or a 5-star rating system
should be pre-approved for use by all agencies and not subject to additional approval
in specific cases.

• Pre-approved questionnaire surveys. There should be pre-approved surveys that use
the questionnaire format. For each survey, multiple questions can be pre-approved as
well, with the agency having the final say as to which questions to use.

• Website analytics. Website analytics that protect users' identities should be pre-
approved for use, and not require re-approval when used in specific instances.

• OMB policy statements. Agencies are often confused about what is and is not
permissible under the Paperwork Reduction Act. OMB should issue clear guidance
explaining the extent to which the Paperwork Reduction Act limits the use of surveys,
and identify information collection techniques that meet the requirements of the Act.
Moreover, it should revisit that guidance as technology changes.

Conclusion

OIRA deserves praise for reexamining the Paperwork Reduction Act. We strongly urge that
the Act be found inapplicable to voluntary surveys on government websites, with OMB
issuing a directive to that effect. However, to the extent the Act is found to apply in these
instances, we believe that its application should be understood in light of the imperative of
ensuring "the greatest possible public benefit from and maximiz[ing] the utility of
information . . . collected . . . and disseminated by or for the Federal Government."

We appreciate the opportunity to comment on these issues and would welcome further
dialogue. Please do not hesitate to contact Sunlight's Policy Counsel, Daniel
Schuman, at dschuman@sunlightfoundation.com or 202-742-1520, with any questions or
comments.

Sincerely yours,

Ellen S. Miller
