Public dialogue, legislative debate, and judicial review have produced filtering strategies in the United States and Canada that are different from those described elsewhere in this volume. In the United States, many government-mandated attempts to regulate content have been barred on First Amendment grounds, often after lengthy legal battles.1 However, the United States government has been able to exert pressure indirectly where it cannot directly censor.

The first wave of regulatory actions in the 1990s in the United States came about in response to the profusion of sexually explicit material on the Internet within easy reach of minors. Since that time, several legislative attempts at creating a mandatory system of content controls in the United States have failed to produce a comprehensive solution for those pushing for tighter controls. At the same time, these legislative attempts to control the distribution of socially objectionable material have given rise to a robust system that limits liability over content for Internet intermediaries such as Internet service providers (ISPs) and content hosting companies. Technical filtering plays a minor role in this regulation.

Signed into law by President Bill Clinton in February 1996, the Communications Decency Act (CDA) was designed to criminalize the transmission of "indecent" material to persons under 18 and the display to minors of "patently offensive" content and communications.5 The CDA took aim not only at the authors of "indecent" material but also at their Internet service providers, although it offered each of them safe harbor if they imposed technical barriers to minors' access.6 Before taking effect, the CDA was challenged in federal court by a group of civil liberties and public interest organizations and publishers who argued that their speech would be chilled by fear of the CDA's enforcement.

The three-judge district court panel concluded that the terms "indecent" and "patently offensive" were sufficiently vague that enforcement of either prohibition would violate the First Amendment.7 "As the most participatory form of mass speech yet developed," Judge Stewart Dalzell wrote in a concurring opinion, "the Internet deserves the highest protection from governmental intrusion."8 The U.S. Supreme Court affirmed this holding in 1997 in the landmark case Reno v. ACLU, invalidating the CDA's "indecency" and "patently offensive" content prohibitions.9

Congress responded to Reno v. ACLU by enacting the Child Online Protection Act (COPA), a second attempt at speaker-based content regulation. The district court enjoined COPA on First Amendment grounds.15 After a few trips to the Supreme Court and back for fact-finding, the district court issued its ruling in March 2007, finding COPA void for vagueness and not narrowly tailored to the government's interest in protecting minors. The Third Circuit Court of Appeals later affirmed this decision, and in January 2009 the Supreme Court put the legislation to rest, at least for now, by refusing to hear the case.

Under the Children's Internet Protection Act (CIPA), a school or library seeking to receive or retain federal funds for Internet access must certify to the FCC that it has installed or will install technology that filters or blocks material deemed to be obscene, child pornography, or material "harmful to minors."16 The Supreme Court rejected First Amendment challenges to CIPA, holding that speakers had no right of access to libraries and that patrons could request unblocking.17 In response, some libraries and schools have rejected E-Rate funding,18 but most have felt financially compelled to install the filters. In the aftermath of CDA, COPA, and CIPA, Internet filtering in the United States is carried out largely by private manufacturers.

Although CIPA mandates the presence of filtering technology in schools and libraries receiving subsidized Internet access, it effectively delegates blocking discretion to the developers and operators of that technology. The criteria "obscene," "child pornography," and "harmful to minors" are defined by CIPA and other existing legislation, but strict adherence to these rather vague legal definitions is beyond the capacity of filters and is inherently subject to the normative and technological choices made during the software design process.