The Record (Bergen County, NJ)

Do Internet filters even work?

by Howard Fienberg
March 23, 2001

We are rightly concerned about the mass of disturbing and obscene material on the Internet, but unsure how to protect our children from it. Many people favor filters (often called “censorware”). In December, Congress passed the Children’s Internet Protection Act (CIPA), requiring federally funded libraries and schools to use filters on their computers or risk losing much of their funding. On March 20, various parties challenged CIPA in federal court. Now it is time to ask the most important question: do Internet filters even work?

There are three basic ways to filter the Internet: software analysis, site labeling, and human analysis. Software analysis leaves the filtering to a computer program that scans sites for objectionable text or images. Site labeling relies on web site owners to voluntarily label their sites’ content. Human analysis relies on a central staff that reviews web sites, compiling lists of approved and unapproved sites.

Unfortunately, none of these methods is particularly effective.

Automated software blocks certain words, and the sites that contain them, regardless of context. The Digital Freedom Network (DFN) held a contest this past fall seeking the most outrageous examples of such failed filtering. The grand prize went to the Carroll High School library, because the school’s web site could not be accessed from the school’s own library computers. Apparently, the site was blocked for containing the suggestive word “high.” The web site of Representative Dick Armey, Majority Leader of the House of Representatives, received the Poetic Justice Award: his site was blocked, because of his “offensive” first name, by some of the very filtering software that, the DFN noted, he has so actively supported.
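
To make the failure mode concrete, here is a minimal sketch, in Python, of the kind of context-blind substring matching described above. The word list and sample pages are invented for illustration; no actual filtering product’s rules are reproduced here.

    # Context-blind keyword filtering: block any page containing a
    # listed word, no matter how the word is used. Illustrative only.
    BLOCKED_WORDS = ["high", "dick"]

    def is_blocked(page_text):
        text = page_text.lower()
        # The program matches characters, not meaning.
        return any(word in text for word in BLOCKED_WORDS)

    print(is_blocked("Welcome to Carroll High School"))  # True: blocked
    print(is_blocked("Office of Rep. Dick Armey"))       # True: blocked
    print(is_blocked("Gardening tips for beginners"))    # False: allowed

Because the program sees characters rather than meaning, a high school and a hard-core site are indistinguishable to it.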

The site labeling system might seem promising, given that the ratings of the Internet Content Rating Association are becoming a popular standard. Unfortunately, the system depends on site owners both to participate and to label honestly. A filter can simply block all unrated sites, but that would cordon off much of the inoffensive Internet as well.
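
As a rough sketch (the sites and rating values below are hypothetical, not the actual ICRA vocabulary), label-based filtering reduces to a lookup that must pick a default for the many sites carrying no label at all:

    # Hypothetical label-based filtering. Real systems read PICS/ICRA
    # labels; these simplified ratings and sites are invented.
    SITE_LABELS = {
        "example-news.com": "general",  # owner labeled it
        "example-adult.com": "adult",   # owner labeled it honestly
    }                                   # unlabeled sites are absent

    def allow(site, block_unrated):
        label = SITE_LABELS.get(site)
        if label is None:
            # The dilemma: block every unrated site (over-blocking)
            # or allow them all (under-blocking).
            return not block_unrated
        return label != "adult"

    print(allow("example-news.com", True))    # True: labeled, allowed
    print(allow("unlabeled-site.com", True))  # False: harmless but blocked
    print(allow("unlabeled-site.com", False)) # True: anything slips through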

Human analysis may be the best tool for filtering, since real people can discern the context of images and text on a case-by-case basis. But faced with a massive number of web sites (some of which might be found only through blind luck), doing anything more than scratching the surface could take years. In addition, this method introduces the most subjective variation in filtering judgment. And since filtering companies keep their lists of blocked sites secret, sites that may have been wrongfully blocked have difficulty disputing the decisions.

Artificial intelligence may someday be able to filter Internet content sensibly, but current censorware is nowhere near that ideal. Today’s censorware need not be dismissed completely, but its flaws must be understood. Tremendous improvements have been made in just the last few years. However, the adaptive nature of the Internet, which makes it an ever more useful tool, also makes it ever more difficult to tame: witness the burgeoning number of web sites offering ways to beat censorware.

So where does that leave us? At base, we need personal human oversight. Quite apart from all the content they block that they should not, filters offer a false sense of security. Censorware might be able to protect children from some “offensive” material, but only human beings can teach children how to discern truth from lies and authoritative information from pure rubbish. Perhaps the most useful, but understated, value in this technology is its “Big Brother” power: the capacity to keep track of where children surf on the Internet. For parents who cannot spare the time to surf with their children, this software can provide a way to monitor their behavior.

Censorware is more of a computerized “Big Brother” of last resort than a reliable safety net. While it has its value, human intervention remains the most important protection for our children.

-- Howard Fienberg is a research analyst with the Statistical Assessment Service (STATS), a nonprofit, nonpartisan think tank in Washington, D.C.

[See the original article at http://www.bergen.com/op-ed/stupid20010323.htm]

[Also ran April 14 in Liberzine as "Censorware can't replace parents"]

