
Censorship in a Box: Why Blocking Software is Wrong

by ACLU

Executive Summary

The Internet is rapidly becoming an essential tool for learning and communication. But the dream of universal Internet access will remain only a dream if politicians force libraries and other institutions to use blocking software whenever patrons log on.

This special report by the American Civil Liberties Union provides an in-depth look at why mandatory blocking software is both inappropriate and unconstitutional in libraries. We do not evaluate any particular product, but rather seek to demonstrate how all blocking software censors valuable speech and gives librarians, educators and parents a false sense of security when providing minors with Internet access.

Our report follows up an August 1997 ACLU special report, Fahrenheit 451.2: Is Cyberspace Burning?, in which we warned that government-coerced industry efforts to rate content on the Internet could torch free speech online. In that report, we offered a set of guidelines for Internet Service Providers and other industry groups contemplating ratings schemes.

Similarly, in Censorship in a Box, we offer a set of guidelines for libraries and schools looking for alternatives to clumsy and ineffective blocking software:

Acceptable Use Policies. Schools should develop carefully worded policies that provide instructions for parents, teachers, students, librarians and patrons on use of the Internet.

Time Limits. Instead of blocking, schools and libraries can establish content-neutral time limits as to when and how young people should use the Internet. Schools can also request that Internet access be limited to school-related work.

"Driver's Ed" for Internet Users. Students should be taught to engage critical thinking skills when using the Internet and to be careful about relying on inaccurate resources online. One way to teach these skills in schools is to condition Internet access for minors on successful completion of a seminar similar to a driver's education course. Such seminars could also emphasize the dangers of disclosing personally identifiable information and communicating about intimate matters with strangers.

Recommended Reading. Libraries and schools should publicize and provide links to websites that have been recommended for children and teens.

Privacy Screens. To avoid unwanted viewing of websites by passers-by -- and to protect users' privacy when viewing sensitive information -- libraries and schools should place privacy screens around Internet access terminals in a way that minimizes public view.

Taken together, these approaches work much better than restrictive software that teaches no critical thinking skills and works only when students are using school or library computers.

Like any technology, blocking software can be used for constructive or destructive purposes. In the hands of parents and others who voluntarily use it, it is a tool that can be somewhat useful in blocking access to some inappropriate material online. But in the hands of government, blocking software is nothing more than censorship in a box.

Introduction

In libraries and schools across the nation, the Internet is rapidly becoming an essential tool for learning and communication. According to the American Library Association, of the nearly 9,000 public libraries in America, 60.4 percent offer Internet access to the public, up from 27.8 percent in 1996. And a recent survey of 1,400 teachers revealed that almost half use the Internet as a teaching tool. But today, unfettered access to the Internet is being threatened by the proliferation of blocking software in libraries.

America's libraries have always been a great equalizer, providing books and other information resources to help people of all ages and backgrounds live, learn, work and govern in a democratic society. Today more than ever, our nation's libraries are vibrant multi-cultural institutions that connect people in the smallest and most remote communities with global information resources.

In 1995, the National Telecommunications and Information Administration of the U.S. Department of Commerce concluded that "public libraries can play a vital role in assuring that advanced information services are universally available to all segments of the American population on an equitable basis. Just as libraries traditionally made available the marvels and imagination of the human mind to all, libraries of the future are planning to allow everyone to participate in the electronic renaissance."

Today, the dream of universal access will remain only a dream if politicians force libraries and other institutions to use blocking software whenever patrons access the Internet. Blocking software prevents users from accessing a wide range of valuable information, including such topics as art, literature, women's health, politics, religion and free speech. Without free and unfettered access to the Internet, this exciting new medium could become, for many Americans, little more than a souped-up, G-rated television network.

This special report by the American Civil Liberties Union provides an in-depth look at why mandatory blocking software is both inappropriate and unconstitutional in libraries. We do not offer an opinion about any particular blocking product, but we will demonstrate how all blocking software censors valuable speech and gives librarians, educators and parents a false sense of security when providing minors with Internet access.

Like any technology, blocking software can be used for constructive or destructive purposes. In the hands of parents and others who voluntarily use it, it is a tool that can be somewhat useful in blocking access to some inappropriate material online. But in the hands of government, blocking software is nothing more than censorship in a box.

The ACLU believes that government has a necessary role to play in promoting universal Internet access. But that role should focus on expanding, not restricting, access to online speech.

Reno v. ACLU: A Momentous Decision

Our vision of an uncensored Internet was clearly shared by the U.S. Supreme Court when it struck down the 1996 Communications Decency Act (CDA), a federal law that outlawed "indecent" communications online.

Ruling unanimously in Reno v. ACLU, the Court declared the Internet to be a free speech zone, deserving of at least as much First Amendment protection as that afforded to books, newspapers and magazines. The government, the Court said, can no more restrict a person's access to words or images on the Internet than it could be allowed to snatch a book out of a reader's hands in the library, or cover over a statue of a nude in a museum.

The nine Justices were clearly persuaded by the unique nature of the medium itself, citing with approval the lower federal court's conclusion that the Internet is "the most participatory form of mass speech yet developed," entitled to "the highest protection from governmental intrusion." The Internet, the Court concluded, is like "a vast library including millions of readily available and indexed publications," the content of which "is as diverse as human thought."

Blocking Software: For Parents, Not the Government

In striking down the CDA on constitutional grounds, the Supreme Court emphasized that if a statute burdens adult speech -- as any censorship law must -- it "is unacceptable if less restrictive alternatives were available."

Commenting on the availability of user-based blocking software as a possible alternative, the Court concluded that the use of such software was appropriate for parents. Blocking software, the Court wrote, is a "reasonably effective method by which parents can prevent their children from accessing material which the parents believe is inappropriate." [Emphasis in the original]

The rest of the Court's decision firmly holds that government censorship of the Internet violates the First Amendment, and that holding applies to government use of blocking software just as it applied when the Court struck down the CDA's criminal ban.

In the months since that ruling, the blocking software market has experienced explosive growth, as parents exercise their prerogative to guide their children's Internet experience. According to analysts at International Data Corporation, a technology consulting firm, software makers sold an estimated $14 million in blocking software last year, and over the next three years, sales of blocking products are expected to grow to more than $75 million.

An increasing number of city and county library boards have recently forced libraries to install blocking programs, over the objections of the American Library Association and library patrons, and the use of blocking software in libraries is fast becoming the biggest free speech controversy since the legal challenge to the CDA.

How Does Blocking Software Work?

The best known Internet platform is the World Wide Web, which allows users to search for and retrieve information stored in remote computers. The Web currently contains over 100 million documents, with thousands added each day. Because of the ease with which material can be added and manipulated, the content on existing Web sites is constantly changing. Links from one computer to another and from one document to another across the Internet are what unify the Web into a single body of knowledge, and what makes the Web unique.

To gain access to the information available on the Web, a person uses a Web "browser" -- software such as Netscape Navigator or Microsoft's Internet Explorer -- to display, print and download documents. Each document on the Web has an address that allows users to find and retrieve it.

A variety of systems allow users of the Web to search for particular information among all of the public sites that are part of the Web. Services such as Yahoo, Magellan, Alta Vista, Webcrawler, Lycos and Infoseek provide tools called "search engines." Once a user has accessed the search service, she simply types a word or string of words as a search request and the search engine provides a list of matching sites.

Blocking software is configured to hide or prevent access to certain Internet sites. Most blocking software comes packaged in a box and can be purchased at retail computer stores. It is installed on individual and/or networked computers that have access to the Internet, and works in conjunction with a Web browser to block information and sites on the Internet that would otherwise be available.

What Kind of Speech is Being Blocked?

Most blocking software prevents access to sites based on criteria provided by the vendor. To conduct site-based blocking, a vendor establishes criteria to identify specified categories of speech on the Internet and configures the blocking software to block sites containing those categories of speech. Some Internet blocking software blocks as few as six categories of information, while others block many more.

Blocked categories may include hate speech, criminal activity, sexually explicit speech, "adult" speech, violent speech, religious speech, and even sports and entertainment.

Using its list of criteria, the software vendor compiles and maintains lists of "unacceptable" sites. Some software vendors employ individuals who browse the Internet for sites to block. Others use automated searching tools to identify which sites to block. These methods may be used in combination. (Examples of blocked sites can be found below and in the Appendix.)

Typical examples of blocked words and letters include "xxx," which blocks out Superbowl XXX sites; "breast," which blocks websites and discussion groups about breast cancer; and the consecutive letters "s," "e" and "x," which block sites containing the words "sexton" and "Mars exploration," among many others. Some software blocks categories of expression along blatantly ideological lines, such as information about feminism or gay and lesbian issues. Yet most websites offering opposing views on these issues are not blocked. For example, the same software does not block sites expressing opposition to homosexuality and women working outside the home.
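To see how this kind of over-blocking arises mechanically, here is a minimal Python sketch of crude substring matching of the sort described above; the word list and page titles are invented for illustration and are not taken from any actual product.

# Illustrative sketch only -- not any vendor's actual code or word list.
import re

BLOCKED_SUBSTRINGS = ["xxx", "sex", "breast"]   # hypothetical vendor word list

def is_blocked(text):
    """Crude filter: lowercase, strip spaces and punctuation, then look for any blocked substring."""
    flattened = re.sub(r"[^a-z0-9]", "", text.lower())
    return any(term in flattened for term in BLOCKED_SUBSTRINGS)

pages = [
    "Superbowl XXX highlights",
    "Parish records kept by the church sexton",
    "NASA plans for Mars exploration",
    "Breast cancer screening resources",
]

for title in pages:
    print("BLOCKED" if is_blocked(title) else "allowed", "-", title)

# All four innocuous pages are blocked: "xxx" appears in "XXX", the letters
# "s", "e", "x" run together in "sexton" and "Mars exploration" once spaces
# are stripped, and "breast" appears in "Breast cancer". The filter matches
# letters, not meanings.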

Clearly, the answer to blocking based on ideological viewpoint is not more blocking, any more than the answer to unpopular speech is to prevent everyone from speaking, because then no viewpoint of any kind will be heard. The American Family Association, a conservative religious organization, recently learned this lesson when it found that CyberPatrol, a popular brand of blocking software, had placed AFA on its "Cybernot" list because of the group's opposition to homosexuality.

AFA's site was blocked under the category "intolerance," defined as "pictures or text advocating prejudice or discrimination against any race, color, national origin, religion, disability or handicap, gender or sexual orientation. Any picture or text that elevates one group over another. Also includes intolerance jokes or slurs." Other "Cybernot" categories include "violence/profanity," "nudity," "sexual acts," "satanic/cult," and "drugs/drug culture."

In a May 28th news release excoriating CyberPatrol, AFA said, "CyberPatrol has elected to block the AFA website with their filter because we have simply taken an opposing viewpoint to the political and cultural agenda of the homosexual rights movement." As one AFA spokesman told reporters, "Basically we're being blocked for free speech."

The AFA said they are planning to appeal the blocking decision at a June 9th meeting of CyberPatrol's Cybernot Oversight Committee, but expressed doubt that the decision would be overturned. The conservative Family Research Council also joined in the fight, saying they had "learned that the Gay Lesbian Alliance Against Defamation (GLAAD) is a charter member of Cyber Patrol's oversight committee," and that "it was pressure by GLAAD that turned Cyber-Patrol around."

Until now, AFA, FRC and similar groups had been strong advocates for filtering software, and AFA has even assisted in the marketing of another product, X-Stop. AFA has said that they still support blocking but believe their group was unfairly singled out.

Indeed, as the AFA and others have learned, there is no avoiding the fact that somebody out there is making judgments about what is offensive and controversial, judgments that may not coincide with their own. The First Amendment exists precisely to protect the most offensive and controversial speech from government suppression. If blocking software is made mandatory in schools and libraries, that "somebody" making the judgments becomes the government.

To Block or Not to Block: You Decide

According to a recent story in The Washington Post, a software vendor's "own test of a sample of Web sites found that the software allowed pornographic sites to get through and blocked 57 sites that did not contain anything objectionable."

And in a current lawsuit in Virginia over the use of blocking software in libraries, the ACLU argues that the software blocks "a wide variety of other Web sites that contain valuable and constitutionally protected speech, such as the entire Web site of Glide Memorial United Methodist Church, located in San Francisco, California, and the entire Web site of The San Francisco Chronicle." (For more on the lawsuit, see "Battling Big Brother in the Library" below.)

Following are real-world examples of the kind of speech that has been found to be inaccessible in libraries where blocking software is installed. Read through them -- or look at them online -- and then decide for yourself: Do you want the government telling you whether you can access these sites in the library?

www.afa.net

The American Family Association is a non-profit group founded in 1977 by the Rev. Donald Wildmon. According to their website, the AFA "stands for traditional family values, focusing primarily on the influence of television and other media - including pornography - on our society."

www.cmu.edu

Banned Books On-Line offers the full text of over thirty books that have been the object of censorship or censorship attempts, from James Joyce's Ulysses to Little Red Riding Hood.

www.quaker.org

The Religious Society of Friends describes itself as "an Alternative Christianity which emphasizes the personal experience of God in one's life." Their site boasts the slogan, "Proud to Be Censored by X-Stop, a popular brand of blocking software."

www.safersex.org

The Safer Sex Page includes brochures about safer sex, HIV transmission, and condoms, as well as resources for health educators and counselors. X-Stop, the software that blocks these pages, does not block "The Safest Sex Home Page," which promotes abstinence before marriage as the only protection against sexually transmitted diseases.

www.iatnet.com/aauw

The American Association of University Women Maryland provides information about its activities to promote equity for women. The Web site discusses AAUW's leadership role in civil rights issues; work and family issues such as pay equity, family and medical leave, and dependent care; sex discrimination; and reproductive rights.

www.sfgate.com/columnists/morse

Rob Morse, an award-winning columnist for The San Francisco Examiner, has written more than four hundred columns on a variety of issues ranging from national politics, homelessness, urban violence, computer news, and the Superbowl, to human cloning. Because his section is considered off limits, the entire www.sfgate.com site is blocked to viewers.

http://www.youth.org/yao/docs/books.html

Books for Gay and Lesbian Teens/Youth provides information about books of interest to gay and lesbian youth. The site was created by Jeremy Meyers, an 18-year-old senior in high school who lives in New York City. X-Stop, the software that blocks this page, does not block web pages condemning homosexuality.

www.sfgate.com

This website is the home of Sergio Arau, a Mexican painter, composer, and musician, who has been called one of Mexico's most diversely talented artists. He has recorded several successful compact disks, including a recent release on Sony Records, and his paintings have been exhibited in numerous museums and galleries, including the Museo Rufino Tamayo in Mexico City.

www.spectacle.org

The Ethical Spectacle is a free online magazine that addresses issues at the intersection of ethics, law, and politics in American life. Jonathan Wallace, the creator of the site, is also co-author with Mark Mangan of Sex, Laws, and Cyberspace, which received much critical praise and is widely available in libraries and book stores around the country.

In addition to these examples, a growing body of research compiled by educators, public interest organizations and other interested groups demonstrates the extent to which this software inappropriately blocks valuable, protected speech, and does not effectively block the sites it claims to block. A list of these studies can be found in Appendix I.

Teaching Responsibility: Solutions that Work...

Instead of requiring unconstitutional blocking software, schools and libraries should establish content-neutral rules about when and how young people should use the Internet, and hold educational seminars on responsible use of the Internet.

For instance, schools could request that Internet access be limited to school-related work and develop carefully worded acceptable use policies (AUPs) that provide instructions for parents, teachers, students, librarians and patrons on use of the Internet. (See Appendix III for information about AUPs and other alternatives to blocking software.)

Successful completion of a seminar similar to a driver's education course could be required of minors who seek Internet privileges in the classroom or library. Such seminars could emphasize the dangers of disclosing personally identifiable information such as one's address, communicating with strangers about personal or intimate matters, or relying on inaccurate resources on the Net.

Whether the use of blocking software is mandatory or not, parents should always be informed that blind reliance on blocking programs cannot effectively safeguard children.

Libraries can and should take other actions that are more protective of online free speech principles. For instance, libraries can publicize and provide links to particular sites that have been recommended for children.

Not all solutions are necessarily "high tech." To avoid unwanted viewing by passers-by, for instance, libraries can place privacy screens around Internet access terminals in ways that minimize public view. Libraries can also impose content-neutral time limits on Internet use.

These positive approaches work much better than restrictive software that works only when students are using school or library computers, and teaches no critical thinking skills. After all, sooner or later students graduate to the real world, or use a computer without blocking software. An educational program could teach students how to use the technology to find information quickly and efficiently, and how to exercise their own judgment to assess the quality and reliability of information they receive.

...and Don't Work

In an effort to avoid installing blocking software, some libraries have instituted a "tap on the shoulder" policy that is, in many ways, more intrusive and unconstitutional than a computer program. This authorizes librarians to peer at the patron's computer screen and tap anyone on the shoulder who is viewing "inappropriate" material.

The ACLU recently contacted a library in Newburgh, New York to advise against a proposed policy that would permit librarians to stop patrons from accessing "offensive" and "racially or sexually inappropriate material." In a letter to the Newburgh Board of Education, the ACLU wrote: "The Constitution protects dirty words, racial epithets, and sexually explicit speech, even though that speech may be offensive to some." The letter also noted that the broad language of the policy would allow a librarian to prevent a patron from viewing on the Internet such classic works of fiction as Chaucer's Canterbury Tales and Mark Twain's Adventures of Huckleberry Finn, and such classic works of art as Manet's Olympia and Michelangelo's David.

"This thrusts the librarian into the role of Big Brother and allows for arbitrary and discriminatory enforcement since each librarian will have a different opinion about what is offensive," the ACLU said.

The First Amendment prohibits librarians from directly censoring protected speech in the library, just as it prevents indirect censorship through blocking software.

Battling Big Brother in the Library

In Loudoun County, Virginia, the ACLU is currently involved in the first court challenge to the use of blocking software in a library. Recently, the judge in that case forcefully rejected a motion to dismiss the lawsuit, saying that the government had "misconstrued the nature of the Internet" and warning that Internet blocking requires the strictest level of constitutional scrutiny. The case is now set to go to trial this fall.

Earlier this year, the ACLU was involved in a local controversy over the mandatory use of Internet blocking programs in California's public libraries. County officials had decided to use a blocking program called "Bess" on every library Internet terminal, despite an admission by Bess's creators that it was impossible to customize the program to filter only material deemed "harmful to minors" by state law.

After months of negotiation, the ACLU warned the county that it would take legal action if officials did not remove Internet blocking software from public library computers. Ultimately, the library conceded that the filters presented an unconstitutional barrier to patrons seeking access to materials including legal opinions, medical information, political commentary, art, literature, information from women's organizations, and even portions of the ACLU Freedom Network website.

Today, under a new policy, the county provides a choice of an unfiltered or a filtered computer to both adult and minor patrons. No parental consent will be required for minors to access unfiltered computers.

The ACLU has also advocated successfully against mandatory blocking software in libraries in San Jose and in Santa Clara County, California. The ACLU continues to monitor the use of blocking software in many libraries across the nation, including communities in Massachusetts, Texas, Illinois, Ohio and Pennsylvania.

The Fight in Congress: Marshaling the Cyber-Troops Against Censorship

In February of this year, Senator John McCain (R-AZ) introduced the "Internet School Filtering Act," a bill that would require all public libraries and schools to use blocking software in order to qualify for "e-rate," a federal funding program to promote universal Internet access. An amendment that would have allowed schools and libraries to qualify by presenting their own plan to regulate Internet access -- not necessarily by commercial filter -- failed in committee.

Another bill sponsored by Senator Dan Coats (R-IN) was dubbed "Son of CDA," because much of it is identical to the ill-fated Communications Decency Act.

The ACLU and others are lobbying against these bills, which have not yet come up for a vote as of this writing.

Censorship in the States: A Continuing Battle

Federal lawmakers are not the only politicians jumping on the censorship bandwagon. In the last three years, at least 25 states have considered or passed Internet censorship laws. This year, at least seven states are considering bills that require libraries and/or schools to use blocking software.

These censorship laws have not held up to constitutional scrutiny. Federal district courts in New York, Georgia and Virginia have found Internet censorship laws unconstitutional on First Amendment grounds in challenges brought by the ACLU. In April, the ACLU filed a challenge to an Internet censorship law in New Mexico that is remarkably similar to the failed New York law.

Conclusion

The advent of new forms of communication technology is always a cause for public anxiety and unease. This was as true for the printing press and the telephone as it was for the radio and the television. But the constitutional ideal is immutable regardless of the medium: a free society is based on the principle that each and every individual has the right to decide what kind of information he or she wants -- or does not want -- to receive or create. Once you allow the government to censor material you don't like, you cede to it the power to censor something you do like -- even your own speech.

Censorship, like poison gas, can be highly effective when the wind is blowing the right way. But the wind has a way of shifting, and sooner or later, it blows back upon the user. Whether it comes in a box or is accessed online, in the hands of the government, blocking software is toxic to a democratic society.

Questions and Answers about Blocking Software

In the interest of "unblocking" the truth, here are answers to some of the questions the ACLU most often encounters on the issue of blocking software:

Q: Why does it matter whether Internet sites are blocked at the library when people who want to see them can just access them at home?

A: According to a recent Nielsen Survey, 45 percent of Internet users go to public libraries for Internet access. For users seeking controversial or personal information, the library is often their only opportunity for privacy. A Mormon teenager in Utah seeking information about other religions may not want a parent in the home, or a teacher at school, looking over her shoulder as she surfs the web.

Q: What about library policies that allow patrons to request that certain sites be unblocked?

A: The stigma of requesting access to a blocked site deters many people from making that request. Library patrons may be deterred from filling out a form seeking access, because the sites they wish to visit contain sensitive information. For instance, a woman seeking to access the Planned Parenthood website to find out about birth control may feel embarrassed about justifying the request to a librarian.

Q: But as long as a library patron can ask for a site to be unblocked, no one's speech is really being censored, right?

A: Wrong. Web providers who want their speech to reach library patrons have no way to request that their site be unblocked in thousands of libraries around the country. They fear patrons will be stigmatized for requesting that the site be unblocked, or simply won't bother to make the request. If public libraries around the country continue to use blocking software, speakers will be forced to self-censor in order to avoid being blocked in libraries.

Q: Isn't it true that librarians can use blocking software in the same way they select books for circulation?

A: The unique nature of the Internet means that librarians do not have to consider the limitations of shelf space in providing access to online material. In a recent ruling concerning the use of blocking software in Virginia libraries, a federal judge agreed with the analogy of the Internet as "a collection of encyclopedias from which defendants [the government] have laboriously redacted [or crossed out] portions deemed unfit for library patrons."

Q: Doesn't blocking software help a librarian control what children see online?

A: The ability to choose which software is installed does not empower a school board or librarian to determine what is "inappropriate for minors." Instead, that determination is made by a software vendor who regards the lists of blocked sites as secret, proprietary information.

Q: Why shouldn't librarians be involved in preventing minors from accessing inappropriate material on the Internet?

A: It is the domain of parents, not librarians, to oversee their children's library use. This approach preserves the integrity of the library as a storehouse of ideas available to all regardless of age or income. As stated by the American Library Association's Office of Intellectual Freedom: "Parents and only parents have the right and responsibility to restrict their own children's access -- and only their own children's access -- to library resources, including the Internet. Librarians do not serve in loco parentis."

Q: What do librarians themselves think about blocking software?

A: The overwhelming majority of librarians are opposed to the mandatory use of blocking software. However, some, under pressure from individuals or local officials, have installed blocking software. The ALA has a Library Bill of Rights, which maintains that filters should not be used "to block access to constitutionally protected speech."

Q: Isn't blocking software an inexpensive way for libraries to monitor Internet use?

A: While parents may be able to purchase a blocking program for around $40, the cost for library systems is much greater. One library has estimated the initial installation of blocking software at $8,000, plus an additional $3,000 a year to maintain. As the court noted in the ongoing Virginia case, "it costs a library more to restrict the content of its collection by means of blocking software than it does for the library to offer unrestricted access to all Internet publications."

Q: Are libraries required to use blocking software in order to avoid criminal liability for providing minors access to speech that may not be protected by the Constitution?

A: No. The First Amendment prohibits imposing criminal or civil liability on librarians merely for providing minors with access to the Internet. The knowledge that some websites on the Internet may contain "harmful" matter is not sufficient grounds for prosecution. In fact, an attempt to avoid any liability by installing blocking software or otherwise limiting minors' access to the Internet would, itself, violate the First Amendment.

Q: Would libraries that do not use blocking software be liable for sexual harassment in the library?

A: No. Workplace sexual harassment laws apply only to employees, not to patrons. The remote possibility that a library employee might inadvertently view an objectionable site does not constitute sexual harassment under current law.

Q: Can't blocking programs be fixed so they block only illegal speech that is not protected by the Constitution?

A: There is simply no way for a computer software program to make distinctions between protected and unprotected speech. This is not a design flaw that may be "fixed" at some future point but a simple human truth. (For more on this subject, see Appendix II.)

Q: What if blocking software is only made mandatory for kids?

A: Even if only minors are forced to use blocking programs, constitutional problems remain. The Supreme Court has agreed that minors have rights too, and the fact that a 15-year-old rather than an 18-year-old seeks access online to valuable information on subjects such as religion or gay and lesbian resources does not mean that the First Amendment no longer applies. In any case, it is impossible for a computer program to distinguish what is appropriate for different age levels, or the age of the patron using the computer.

Q: Is using blocking software at schools any different than using it in public libraries?

A: Unlike libraries, schools do act in place of parents, and play a role in teaching civic values. Students do have First Amendment rights, however, and blocking software is inappropriate, especially for junior high and high school students.

In addition, because the software often blocks valuable information while allowing access to objectionable material, parents are given a false sense of security about what their children are viewing. A less restrictive -- and more effective -- alternative is the establishment of content-neutral "Acceptable Use Policies" (AUPs). (See Appendix III).

Q: Despite all these problems, isn't blocking software worth it if it keeps some pornography from reaching kids?

A: Even though sexually explicit sites only make up a very small percentage of content on the Internet, it is impossible for any one program to block out every conceivable web page with "inappropriate" material.

When blocking software is made mandatory, adults as well as minors are prevented from communicating online, even in schools. According to a recent news story in the Los Angeles Times, a restrictive blocking program at a California school district meant coaches couldn't access the University of Notre Dame's website, and math instructors were cut off from information about Wall Street because of a block on references to money and finance.

Q: Does this mean that parents can't use blocking software in the home?

A: No. The ACLU believes that parents have a right to use -- or not use -- whatever blocking software they choose.

Appendix I

Following are brief descriptions of some recent studies and reports addressing specific problems with blocking software.

http://www2.epic.org/reports/filter_report.html

Faulty Filters: How Content Filters Block Access to Kid-Friendly Information on the Internet, reviewed the impact of a "family-friendly" search engine from the NetShepherd Corporation. The report, released by the Electronic Privacy Information Center (EPIC), compared 100 search results using the unfiltered AltaVista search engine and using AltaVista in conjunction with the NetShepherd search engine. EPIC found that NetShepherd typically blocked access to 95-99 percent of material available on the net that might be of interest to young people -- including the American Red Cross, the San Diego Zoo, and the Smithsonian Institution. At the time EPIC's report was written, Net Shepherd claimed that it had reviewed "97% of the English language sites on the Web," a claim that was later retracted.

http://www.glaad.org

Access Denied, a report by the Gay and Lesbian Alliance Against Defamation (GLAAD), concludes that most blocking products categorize and block all information about gays and lesbians in the same manner that they block sexually explicit and pornographic material. For instance, the report noted that the rating program SurfWatch blocked online sites such as the International Association of Gay Square Dance Clubs, the Queer Resources Directory, the Lesbian/Gay/Bisexual Association of the University of California at Berkeley, and the Maine Gay Network.

http://www.spectacle.org/cwp/

Blacklisted by CyberPatrol, a report by the Censorware Project, found that CyberPatrol software blocks tens of thousands of web pages with innocent content, simply because a few users linked to more sexually explicit web pages. The report also shows that wrongfully blocked sites are often inaccurately described by CyberPatrol. For instance, "Full Nude Sex Acts" was used to describe websites for the U.S. Army Corps of Engineers Construction Engineering Research Laboratories, Cafe Haven at Brigham Young University, a server at the Japan Institute of Technology in Chiba, Japan, and the Department Of Computer Science at Queen Mary Westfield College. None of these websites were found to contain explicit material.

The Internet Free Expression Alliance, of which the ACLU is a founding member, is an excellent resource for more links to studies, background information and news articles on Internet censorship. The website is http://www.ifea.net.

Appendix II: Excerpts from Computer Professionals for Social Responsibility

Filtering FAQ

What is a content filter?

A content filter is one or more pieces of software that work together to prevent users from viewing material found on the Internet. This process has two components.

Rating: Value judgments are used to categorize websites based on their content. These ratings could use simple allowed/disallowed distinctions like those found in programs like CyberSitter or NetNanny, or they can have many values, as seen in ratings systems based on Platform for Internet Content Selection (PICS).

Filtering: With each request for information, the filtering software examines the resource that the user has requested. If the resource is on the "not allowed" list, or if it does not have the proper PICS rating, the filtering software tells the user that access has been denied and the browser does not display the contents of the web site.
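As a rough sketch of how those two components fit together at request time, the Python fragment below checks each URL against a "not allowed" list and against a PICS-style numeric label before deciding whether to display it; every host name, label value and threshold here is a hypothetical placeholder rather than the behavior of any particular product.

# Minimal sketch of request-time filtering -- all hosts, labels and limits are hypothetical.

BLOCKED_HOSTS = {"blocked.example.com"}        # vendor-style "not allowed" list
MAX_ALLOWED_RATING = 1                         # highest label value the installer permits

# Pretend PICS-style labels keyed by host (0 = general audience, higher = more restricted).
RATING_LABELS = {
    "news.example.org": 0,
    "adult.example.net": 4,
}

def allow_request(url):
    """Return True if this sketch of a filter would let the browser display the URL."""
    host = url.split("/")[2]                   # crude host extraction for "http://host/path" URLs
    if host in BLOCKED_HOSTS:
        return False                           # explicitly listed: access denied
    rating = RATING_LABELS.get(host)
    if rating is None:
        return False                           # one possible default: deny unlabeled content
    return rating <= MAX_ALLOWED_RATING        # otherwise compare the label to the configured limit

for url in ["http://news.example.org/story",
            "http://adult.example.net/",
            "http://unlabeled.example.com/"]:
    print(url, "->", "displayed" if allow_request(url) else "access denied")

The "deny unlabeled content" branch is shown only to make both checks explicit; how a real product treats unrated sites is a vendor decision.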

The first content filters were stand-alone systems consisting of mechanisms for determining which sites should be blocked, along with software to do the filtering, all provided by a single vendor.

The other type of content filter is protocol-based. These systems consist of software that uses established standards for communicating ratings information across the Internet. Unlike stand-alone systems, protocol-based systems do not contain any information regarding which sites (or types of sites) should be blocked. Protocol-based systems simply know how to find this information on the Internet, and how to interpret it.

Can filtering programs be turned off?

It is assumed that parents or other authoritative users who install filtering programs would control the passwords that allow the programs to be disabled. This means that parents can enable the filter for their children but disable it for themselves. As with all other areas of computer security, these programs are vulnerable to attack by clever computer users who may be able to guess the password or to disable the program by other means.

I don't want to filter, but I do want to know what my child is viewing. Is that possible?

Some products include a feature that will capture the list of all Internet sites that have been visited from your computer. This allows a parent to see what sites their child has viewed, albeit after the fact. Similar software allows employers to monitor the Internet use of their employees. Users of these systems will not know that their Internet use is being watched unless they are explicitly told. Whether used in homes or workplaces, these tools raise serious privacy concerns.

What is the scope of Internet content filtering? Do filters cover the WWW? Newsgroups? IRC? Email?

While some stand-alone systems claim to filter other parts of the Internet, most content filters are focused on the World Wide Web. Given the varied technical nature of the protocols involved, it's likely that filtering tools will do well with some of these, and poorly with others. For example, filtering software can easily block access to newsgroups with names like "alt.sex". However, current technology cannot identify the presence of explicit photos in a file that's being transferred via FTP. PICS-based systems currently only filter web sites.

Stand-alone Systems

What is a stand-alone system?

A stand-alone filtering system is a complete filtering solution provided by a single vendor. These filters block sites based on criteria provided by the software vendor, thus "locking in" users. If a customer does not like the vendor's selection of sites that are to be blocked, she must switch to a different software product.

Who decides what gets blocked and what doesn't?

This is the biggest practical difference between stand-alone systems and protocol-based systems. Stand-alone systems limit users to decisions made by the software vendor, although some let the parents or installers remove sites. Protocol-based systems provide users with a choice between alternative ratings systems, which publishers and third parties can use to develop ratings for content.

How do stand-alone programs determine what should be blocked?

Currently available filtering tools use some combination of two approaches to evaluate content: lists of unacceptable (or acceptable) sites, and keyword searches.

List-based blocking works by explicitly enumerating sites that should either be blocked or allowed. These lists are generally provided by filter vendors, who search for sites that meet criteria for being classified as either "objectionable" or "family-friendly."

Filtering software vendors vary greatly in the amount of information and control they make available to users. Most vendors do not allow users to see the actual list of blocked sites, as it is considered to be a kind of trade secret. However, some vendors provide detailed descriptions of the criteria used to determine which sites should be blocked. Some vendors might allow users to add sites to the list, either in their own software or by sending sites to the vendor for review.

Keyword-based blocking uses text searches to categorize sites. If a site contains objectionable words or phrases, it will be blocked.

What's wrong with list-based filtering?

There are several problems with filtering based on lists of sites to be blocked.

First, these lists are incomplete. Due to the decentralized nature of the Internet, it's practically impossible to definitively search all Internet sites for "objectionable" material. Even with a paid staff searching for sites to block, software vendors cannot hope to identify all sites that meet their blocking criteria. Furthermore, since new websites are constantly appearing, even regular updates from the software vendor will not block out all adult websites. Each updated list will be obsolete as soon as it is released, as any site that appears after the update will not be on the list, and will not be blocked. The volatility of individual sites is yet another potential cause of trouble. Adult material might be added to (or removed from) a site soon after the site is added to (or removed from) a list of blocked sites.

Blocking lists also raise problems by withholding information from users, who may or may not have access to information describing the criteria used to block websites. While some vendors provide descriptions of their blocking criteria, this information is often vague or incomplete. Several vendors have extended blocking beyond merely "objectionable" materials. In some instances, political sites and sites that criticize blocking software have been blocked.

This obscurity is compounded by practices used to protect these lists of blocked sites. Vendors often consider these lists to be proprietary intellectual property, which they protect through mathematical encryption, which renders the lists incomprehensible to end users. As a result, users are unable to examine which sites are blocked and why. This arbitrary behavior demeans the user's role as an active, thoughtful participant in their use of the Internet.

What's wrong with filtering based on keyword searches?

Keyword searching is a crude and inflexible approach that is likely to block sites that should not be blocked while letting "adult" sites pass through unblocked. These problems are tied to two shortcomings of this approach:

Keyword searches cannot use contextual information. While searches can identify the presence of certain words in a text, they cannot evaluate the context in which those words are used. For example, a search might find the word "breast" on a web page, but it cannot determine whether that word was used in a chicken recipe, an erotic story, or in some other manner. In one notable incident, America Online's keyword searches blocked a breast cancer support group.

Keyword searches cannot interpret graphics. It is not currently possible to "search" the contents of a picture. Therefore, a page containing sexually explicit pictures will be blocked only if the text on that page contains one or more words from the list of words to be blocked.
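A short Python sketch (hypothetical pages and word list, in the same spirit as the example earlier in this report) makes both shortcomings concrete: the word "breast" in a recipe is blocked exactly like the same word anywhere else, while a page whose only content is an image offers no text at all for the keyword search to examine.

# Illustrative only: keyword matching is context-blind and cannot "see" pictures.

BLOCKED_WORDS = {"breast"}                     # hypothetical keyword list

def keyword_blocked(page_text):
    """Block the page if any listed word appears in its text, regardless of context."""
    words = page_text.lower().split()
    return any(word.strip(".,") in BLOCKED_WORDS for word in words)

recipe = "Season the chicken breast and roast for forty minutes."
support_group = "Our breast cancer support group meets online every Tuesday."
image_only_page = '<html><body><img src="photo.jpg"></body></html>'   # explicit photo, no text

for name, text in [("recipe", recipe),
                   ("support group", support_group),
                   ("image-only page", image_only_page)]:
    print(name, "->", "BLOCKED" if keyword_blocked(text) else "allowed")

# The recipe and the support group are both blocked; the image-only page
# passes because there is no text for the keyword search to match.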

Version 1.1, 12/25/97
Excerpted from the CPSR Filtering FAQ
Written by Harry Hochheiser, CPSR Board Member
[email protected]
www.cpsr.org

 