A Word About Our Blog Entries

The Julie Group shares a professional interest in digital and emerging technology and the law. As professionals, we have a rich and deep appreciation for the differences of opinion that can appear in this space. You should never assume that an opinion, where it is introduced, is universally shared and endorsed by all our volunteers, nor that it is necessarily the best snapshot of a given issue.

Readers are expected to think about the issues, question everything worth discussing, and add value to the conversation by correcting what's here or broadening the understanding of the subject. This is part of the educational process between us all. Our hope is that this exercise results in better law, law enforcement, and citizen participation in forging sophisticated social understandings of the technological forces changing our lives.

Friday, August 3, 2007

The Child Safe Viewing Act of 2007 Tackles the Wrong Problem

The Senate Commerce Committee today passed the Child Safe Viewing Act of 2007 (S. 602), which requires the Federal Communications Commission (FCC) to study the "existence and availability" of filtering technologies for audio and video content transmitted over "wired, wireless, and Internet" platforms, as well as other devices.

The stated purpose of the bill is to develop the "next generation of parental control technology."

While I have no objection to parental controls and filters overall, I am concerned that there is an over-reliance on these devices as a method to shield children from objectionable content. That over-reliance shifts focus away from the source of the content -- those who create, distribute, and profit from pornography by employing deceptive marketing techniques, adware, and malware as their primary methods of distribution.

There's also nothing wrong with studying different methods to protect children. However, this bill calls for the FCC to "consider measures to encourage or require the use of advanced blocking technologies that are compatible with various communications devices or platforms." It goes on to define those platforms as "wired, wireless, and Internet platforms."

The Center for Democracy and Technology released this statement earlier this week in response to the language contained in the measure:



The Senate Commerce Committee today passed the Child Safe Viewing Act of 2007 (S. 602), which requires the Federal Communications Commission (FCC) to study the "existence and availability" of filtering technologies for audio and video content transmitted over "wired, wireless, and Internet" platforms, as well as other devices. CDT does not oppose a purely fact-finding study, but maintains that a neutral, non-regulatory body such as the National Academy of Sciences would be better suited to such a project. More importantly, CDT is concerned that this legislation may represent a step toward expanding the FCC's censorship authority to include Internet content.

This measure aims at the wrong end of the beast. It would be far more productive for lawmakers in this country to concentrate their resources on the creators and beneficiaries of illegal content on the Internet, and to leave decisions about what parents do and do not wish their children to view to the parents themselves. This is the view of the Progress & Freedom Foundation as well, as expressed in their paper released earlier this week discussing the implications of content regulation. From their press release:

In his paper, Thierer asserts that there is no need for Congress or the FCC to mandate tools that already exist. Moreover, if the FCC starts "approving" certain technologies, it is likely to slow the development of new blocking technologies.

Thierer's view is that the measure, while well intentioned, would create unnecessary regulation where there is no market failure at work. As he pointed out in his recent Progress and Freedom Foundation book, Parental Controls and Online Child Protection: A Survey of Tools and Methods, "There has never been a time in our nation's history when parents have had more tools and methods at their disposal to help them decide what is acceptable in their homes and in the lives of their children."


Everyone wants to protect children from rogues, no matter what space they inhabit. That's why I taught my kids not to talk to strangers, not to walk anywhere by themselves, and other basic safety measures to keep them safe on city streets, on playgrounds, and even in their front yard. Safe Internet, television, and wireless phone use is also a parental responsibility. What I rely on lawmakers and associated agencies to do is enforce the laws and hold the lawbreakers -- those who create, produce, and distribute harmful content -- accountable for the harm they do with their selfish acts.

Filtering content on a mandatory basis pushes all of us down the slippery slope of accepting others' value judgments about what is, and what isn't, "dangerous content" subject to blocking. This measure calls for the FCC to craft rules that ignore existing rating systems in favor of defining the scope of the filters on some other subjective basis.

Will the art of the Old Masters be deemed to fall into the category of objectionable content? Some would say it should be, if it depicts genitalia; others (including me) would not. What about sites that discuss legitimate issues surrounding sexuality and relationships? How would mandatory filtering be enforced? Through the telecommunications and cable companies, or by the individual?

Filtering technologies have inherent, well-documented flaws that will not disappear, simply because human communication is too complex and too nuanced to be fed into a database of prohibitions. For every rule there are ten exceptions. The appropriate approach is to identify and prosecute creators and distributors of clearly illegal content, and to exercise parental discretion and supervision over Internet, wireless, and television viewing.

What do you think? Do filters work? Parents, how do you keep your children protected on the Internet?


The opinions expressed in this post are those of the author and should not be interpreted as an official position of The Julie Group.


--Karoli

Bonus Link: Berkman Center for Internet & Society: 2007 OpenNet Initiative

Thursday, August 2, 2007

A Curious Conundrum

There I was, minding my own business - well, complaining about adware - when the verdict from the Chris Langham trial popped onto the TV. Though I missed the bulk of the news report, I did catch one of the legal-type guys saying something that can be summed up like this:

"These sort of images (child pornography) don't just pop onto your screen. You have to go looking for it".

Now, it's clear that in this case, Langham did indeed "go looking for it", though only he can truly know if the motive was "research" or not.

However. The above statement came across as a more general, sweeping claim, not one specific to Chris Langham.

This is sort of dangerous.

Already, we've had a web browser that fires up illegal porn without the permission of the person using it. There are currently plans in the UK to make certain kinds of extreme porn - rape, for example - illegal. But here's a hijack from a few days ago involving supposedly normal blogs immediately redirecting the end-user to rape porn.

Bam, just like that. Have some potentially illegal porn, on the house. Quick, free, easy and totally without warning.

I have little faith in UK law at the best of times, especially where cases of a technological nature are concerned. Putting out the message that you "have to go looking for it" is spurious at best, and not a belief the public at large should be lulled into.

I never liked TV anyway.