"Over the last year, the American Civil Liberties Union has asked officials from hundreds of school districts around the country to make changes in their Internet screening systems to eliminate bias, said Anthony Rothert, a civil liberties lawyer based in St. Louis.

All have agreed to do so, he said, except Camdenton, which the A.C.L.U. sued last summer.

The lawsuit — believed to be the first of its kind — does not claim that this rural district of 4,200 students purchased the software with the intent of discriminating. Rather, it says, once there were complaints about the filter last year, school officials refused to replace it. An investigator for the A.C.L.U. has been able to figure out how the filter works, but not who developed it.

… Judge Laughrey noted that the URLBlacklist filter was even bad at doing its primary, legal job: blocking pornography. Tested on its ability to recognize 500 sexually explicit sites, it missed 30 percent of them. CIPAFilter, one of the leaders in the field, missed 3 percent.”

(via Library Stuff)
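The 30 percent and 3 percent figures come from a simple pass/fail test: feed the filter a list of known-explicit sites and count how many slip through. Here is a minimal sketch of that kind of scoring in Python, where the filter_blocks() predicate and the test list are stand-ins I made up, not the court's actual methodology:

    # Hypothetical scoring harness for a filter's miss rate.
    def miss_rate(test_urls, filter_blocks):
        """Fraction of known-explicit URLs the filter fails to block."""
        missed = [url for url in test_urls if not filter_blocks(url)]
        return len(missed) / len(test_urls)

    # A filter that misses 150 of 500 test sites scores 0.30 (30 percent),
    # like URLBlacklist above; one that misses 15 of 500 scores 0.03.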

A federal judge ordered a central Missouri school district to stop using Internet filtering software that blocks access to gay, lesbian and transgender issue-related websites.

…  Judge Laughrey found that the school’s Internet filter, URL Blacklist, systematically discriminates against gay-friendly websites.

"Sexuality filters are normally used to filter out pornographic material, but the URL Blacklist filter has the affect of filtering out positive material about LGBT issues as well as pornographic material," Laughrey wrote. "PFLAG has identified forty-one websites blocked by URL Blacklist’s ‘sexuality’ filter that express a positive viewpoint toward LGBT individuals.  PFLAG tested these forty-one websites on five different Internet filter systems designed to help schools comply with CIPA [Children’s Internet Protection Act]. None of these five filter systems blocked any of these forty-one websites as prohibited by CIPA. On the other hand, URL Blacklist generally categorizes websites expressing a negative view toward LGBT individuals in its ‘religion’ category, and does not block them with its ‘sexuality’ filter. Thus, URL Blacklist systematically allows access to websites expressing a negative viewpoint toward LGBT individuals by categorizing them as ‘religion’, but filters out positive viewpoints toward LGBT issues by categorizing them as ‘sexuality’.”

(via Library Stuff, emphasis mine)
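The mechanism Judge Laughrey describes is easy to picture: the filter assigns each site a single category and then blocks whole categories, so any viewpoint skew comes from how the sites get categorized, not from the blocking rule itself. Here is a minimal Python sketch of that idea; the site names and category assignments are invented for illustration, not URL Blacklist's actual data:

    # Illustrative category-based blocking; all of the data here is made up.
    BLOCKED_CATEGORIES = {"pornography", "sexuality"}

    SITE_CATEGORIES = {
        "example-lgbt-support.org": "sexuality",   # positive LGBT viewpoint
        "example-anti-lgbt.org": "religion",       # negative LGBT viewpoint
        "example-adult-site.com": "pornography",
    }

    def is_blocked(host):
        """Block a site when its assigned category is on the blocked list."""
        return SITE_CATEGORIES.get(host, "uncategorized") in BLOCKED_CATEGORIES

    # The rule itself is viewpoint-neutral, but with these assignments the
    # pro-LGBT site is blocked ("sexuality") while the anti-LGBT site passes
    # ("religion"). That is exactly the asymmetry the court found.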

"The American Civil Liberties Union (ACLU) has just filed a complaint (PDF) on behalf of a Salem, Missouri resident named Anaka Hunter, who contends that the Salem public library is unconstitutionally blocking her ability to access information on ‘minority’ religious views. Federal and state law both govern libraries in Missouri, which are generally ordered to block access to obscene online material and child pornography. But the Salem library allegedly goes far beyond the mandate.

The library’s ‘Netsweeper’ content filtering system can block a huge variety of material, from porn to P2P to ‘occult’ to ‘criminal skills,’ but it’s up to the institution to choose which content categories will get filtered. Hunter claims that while looking into Native American and Wiccan religious practices, she was repeatedly halted by the filter’s ‘occult’ and ‘criminal skills’ categories. When she complained, she says that the library staff wasn’t especially helpful.”

(via Library Stuff)
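Worth noting how these products work: the vendor ships a pile of categories, and the institution decides which ones actually get enforced. A rough sketch of that institution-side choice (the category names come from the article; the rest is my own illustration, not Netsweeper's actual configuration format):

    # Categories a hypothetical filter product offers.
    AVAILABLE_CATEGORIES = {
        "pornography", "p2p", "occult", "criminal skills", "gambling",
    }

    # The legal mandate described above covers obscene material and child
    # pornography; switching on "occult" or "criminal skills" is a local
    # policy choice that goes well beyond it.
    ENABLED_CATEGORIES = {"pornography", "occult", "criminal skills"}

    def category_blocked(category):
        return category in ENABLED_CATEGORIES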

A city councilor in Massachusetts thinks he’s come up with a way to stop people from looking at pornography on public library computers — name them and shame them.
Quincy Councilor Daniel Raymondi has asked Mayor Thomas Koch to make public a list of people who have viewed pornography on library computers within the past year. The council unanimously approved a resolution on the idea last week.
Library director Ann McLaughlin tells The Patriot Ledger that using library computers to access porn is against policy, and violators are given two warnings before they are banned. She says she’s not sure publicly naming violators would work.
A spokesman for the mayor says the city’s legal department is reviewing Raymondi’s request. (via LISNews)

Presented without comment.

 "Last month, Google launched an encrypted version of its Web search, allowing users to enable a Secure Sockets Layer (SSL) connection to encrypt their information. Like several other Google products that feature SSL encryption, including email and Docs, Google touted this move as a step towards enhancing users’ privacy and security.

But as the encrypted searches mean that data cannot be logged, filtered, or blocked, Google’s new secure search runs afoul of CIPA, the Children’s Internet Protection Act. And with the service’s beta release, many schools are now facing some difficult decisions in how to respond.

CIPA requires schools to monitor, and in some cases block, certain websites. And while filtering is not necessarily a popular tactic (the American Library Association and the ACLU have sought to overturn the law), schools and libraries receiving federal E-rate funding must comply.” (via LISNews)
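The conflict is architectural. A CIPA-style filter usually sits on the network path and reads each request as it goes by, but SSL hides everything except (roughly) the destination host. Here is a simplified sketch of what such an on-path filter can and cannot see, purely hypothetical and not any vendor's implementation:

    from urllib.parse import urlsplit

    def visible_to_network_filter(url):
        """Approximate what an on-path filter can read from a request."""
        parts = urlsplit(url)
        if parts.scheme == "http":
            # Plaintext HTTP: the path and query string (the search terms)
            # are readable, so they can be logged, filtered, or blocked.
            query = "?" + parts.query if parts.query else ""
            return f"{parts.hostname}{parts.path}{query}"
        # HTTPS: the payload is encrypted; roughly only the hostname is
        # observable (via DNS or the TLS handshake), not the search terms.
        return parts.hostname

    print(visible_to_network_filter("http://www.google.com/search?q=anything"))
    # -> www.google.com/search?q=anything
    print(visible_to_network_filter("https://www.google.com/search?q=anything"))
    # -> www.google.com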

Why internet filters don’t work and why libraries who filter are wrong

ReadWriteWeb’s coverage brought up the ethical argument against filtering. Just because someone is using a library computer, does that mean that he or she automatically has less access to information? It shouldn’t, and libraries are fighting for information access rights every day.

Besides the ethical argument against filtering there are plenty of practical arguments. Namely, filters don’t work, they cost a lot of money, and take a lot of time to operate.

… Looking at our own library’s study as well as all of the published studies done in the last decade, it’s consistently found that 15-20% of the time, content is over-blocked (e.g. benign sites that are blocked incorrectly). And 15-20% of the time, content is under-blocked (e.g. sites deemed “bad” get through anyway). This means that out of 100 interactions (website views, searches), 15-20 of them will be incorrectly allowed through and 15-20 of them will be incorrectly blocked. We found that overall, filters have only about a 60-70% accuracy rate for traditional text content.

… Filters simply do not work on multimedia content, which is usually what people think the filters are for (naughty videos and photos). The accuracy in filtering images, audio, video, RSS feeds, and social networking content is embarrassingly low: about 40%.
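The 60-70% figure for text follows straight from the two error rates, if you treat over-blocks and under-blocks as fractions of all interactions the way the quoted passage does. A back-of-the-envelope version:

    def text_filter_accuracy(overblock_rate, underblock_rate):
        """Share of interactions the filter handles correctly."""
        return 1.0 - overblock_rate - underblock_rate

    print(text_filter_accuracy(0.15, 0.15))  # about 0.70, the optimistic end
    print(text_filter_accuracy(0.20, 0.20))  # about 0.60, the pessimistic end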

This article is a good librarian’s-eye view of the Supreme Court decision and the current filtering situation.  I recommend clicking through for a quick rundown of how filtering software really works—or more to the point, fails to work.

This article also mentions the point of my last post, which is the so-called “unblocking” process for adults.  Most patrons are too embarrassed to even request unblocking, and those who do apparently have to wait hours or even days for the library to enable the site that shouldn’t have been restricted in the first place.

*climbs off soapbox*

In a ruling that may lead some libraries to adopt more stringent Internet filtering policies, the Supreme Court of Washington has agreed, 6-3, that a public library can filter Internet access for all patrons without disabling the filter at the request of an adult patron seeking access to websites with constitutionally protected material.

One fact I found especially surprising was the amount of time unblocking has traditionally taken: “The court noted that, of 92 unblocking requests, 27 were responded to in the same day, 29 the next day, 20 in three days, and five in a longer period. (There was no evidence about the other 11 requests.)”

I guess I’ve been naive, because through all the discussions of internet filtering in library school, I assumed a request to have a site unblocked would be dealt with immediately. Not the next day, or maybe by your next trip to the library. If there is filtering software on the library’s computers, I think there should be a librarian on staff who knows how to unblock sites. It’s a matter of user convenience.