CZ:Proposals/Should we reform our family-friendly policy?

Revision as of 20:41, 14 May 2008

This proposal has not yet been assigned to any decisionmaking group or decisionmaker(s).
The Proposals Manager will do so soon if and when the proposal or issue is "well formed" (including having a driver).
For now, the proposal record can be found in the new proposals queue.


Driver: Brian P. Long

Complete explanation

I think it's pretty clear that this is, in Citizendium parlance, an issue: there are people on both sides of this debate.

Option 1: Continuing Self-Censorship

The first option would be to maintain the status quo, either by leaving our Statement of Fundamental Policies and our policy pages untouched, or by rewriting the policies but keeping an implicit or explicit ban on profanity and violent or sexually explicit material. Making this determination for human sexuality articles would be particularly tricky, and we would likely need to spell out in advance that we will have an article on topic X but not topic Y.

Option 2: Overhauling the Policy

A good, general-purpose encyclopedia does not appeal primarily to the prurient interest, and it is worth keeping something on the books to this effect. (However, if Citizendium is set up to allow effective filtering, trying to police content should become a non-issue.)

Overhauling our content policy would mean moving away from self-censorship, and moving closer towards the Intellectual Freedom policy recommended by the American Library Association. That is, we should provide parents, teachers and librarians with the tools to select content for their children, students and patrons, but we should not seek to act in their stead.

A new policy might read: "Citizendium does not tolerate material with needlessly explicit language or images." The key word in this definition is "needlessly." An earnest article on gangsta rap will have very compelling reasons of fairness and accuracy for including profanity.

Reasoning

At the moment, we have not spelled out just what content is or is not permissible in our articles (e.g. nudity, profanity, graphic violence, human sexuality). At the same time, though, all users on Citizendium are required to sign on to the family-friendly principle when they agree to our Statement of Fundamental Policies. Furthermore, "family-friendly" is a phrase with distinct ideological overtones, and has become something of a term of art.

Our current policy, moreover, has not been particularly successful. As of this writing, there are at least two pages with language that is explicit by any definition, and there have been a number of other cases where, after lengthy debate, content or discussion was deemed to be un-family-friendly by some contributors and not by others. There are situations where extended debate is productive and informative, but personal standards of decency are just that-- personal. Extended debate on these issues is a waste of time.

A school district or a public library is inextricably connected to a particular community. The school or library is funded by the community, and has a duty to be responsive to the needs and concerns of community members. Citizendium, on the other hand, is on the internet. We have no immediate community that will tell us when our content has violated the community's standards. We have, by contrast, contributors and readers from many different countries and communities around the world, where, as discussion on the forums has shown, standards differ widely. Parents, educators and librarians will need to make judgments about what content they find appropriate for their communities, but Citizendium as a whole should move away from self-censorship.

There is a tension between the desire to avoid offending some readers and the desire to write a bold, interesting encyclopedia. There is a class of literature which relies on bawdiness and vulgarity for effect; serious writers and translators no longer rely on Latinate language or asterisks, and neither should we. Analogous cases may be found in Art History and Music, and there is very little content we can provide on human sexuality without offending someone's sense of 'family-friendliness.'

Citizendium's goal, generally speaking, is to provide a free, reliable encyclopedia, but this goal is vitiated if our content is filtered and thereby inaccessible to secondary school students. A central component of this proposal is therefore to find a method of marking our potentially objectionable content so that it can be blocked by internet filtering software (8e6 & Bess at the school district level; NetNanny, Squidproxy & Norton for the home user).

Implementation

  1. If approved, rewriting policy pages
  2. Evaluating filtering software; figuring out how to integrate content ratings, so to speak, with MediaWiki software (see the sketch after this list)
  3. Implementing technical changes, if necessary
  4. Writing 'A Guide to Citizendium for Parents and Educators' (and maybe a separate guide for librarians)
  5. And after all of this, finally changing the policy!
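
To make point 2 concrete, here is a minimal sketch, in Python, of how a rating marker might be pulled out of a page's wikitext. The template name ({{Content-rating}}) and its parameters are invented for illustration; an actual scheme would have to be agreed on and wired into the MediaWiki side.

  import re

  # Hypothetical marker an author would place in a page's wikitext, e.g.:
  #   {{Content-rating|profanity=yes|nudity=no}}
  # The template name and parameter names are illustrative only.
  RATING_RE = re.compile(r"\{\{\s*Content-rating\s*\|([^}]*)\}\}", re.IGNORECASE)

  def parse_rating(wikitext):
      """Return the rating parameters from a page's wikitext as a dict."""
      match = RATING_RE.search(wikitext)
      if not match:
          return {}
      pairs = (part.split("=", 1) for part in match.group(1).split("|") if "=" in part)
      return {k.strip().lower(): v.strip().lower() for k, v in pairs}

  page = "Article text... {{Content-rating|profanity=yes|nudity=no}} More text."
  print(parse_rating(page))  # {'profanity': 'yes', 'nudity': 'no'}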

Discussion

A discussion section, to which anyone may contribute.

Without speaking about more substantive matters, I want to address one meta-issue: we can talk about "reforming" the policy (depending on what is meant by that). But the policy will not be rejected: the basic concept is part of our fundamental policies, and the Editorial Council has no standing (and perhaps never will) to edit those policies. --Larry Sanger 21:23, 25 April 2008 (CDT)

I am by no means arguing that we scrap the policy, or that we reject the basic concept. Whether we're in agreement about other matters, I think we are in agreement about bearing the concerns of parents, educators and librarians in mind. Brian P. Long 21:38, 25 April 2008 (CDT)

You mention pages that have had problems concerning the family policy. I do not know of these, and I do not see how we can have a discussion without substantive material to discuss. If there have been actual cases, then why have you not specified the exact article titles, and what conceptually seems to be the problem with each one? Otherwise, this is all theoretical and you cannot improve on the existing policy without going into detail. Martin Baldwin-Edwards 17:05, 3 May 2008 (CDT)

I assume you are referring to my comment in the 'Reasoning' section above-- I was obliquely referring to two articles in the main namespace with profanity. I see that John Stephenson has asterisked one of them already (The Lord of the Rings), but the other one is Luigi Meneghello.
I think the important point is that both of these articles have had profanity in them for a long time-- the LOTR article since it was written in December, and the Meneghello article since July 2007. Beyond that, though, even though these articles contain (or contained) profanity, it's still not clear that these articles were not "family friendly." At present, we have a slogan, not a policy. If vulgar language will get us filtered completely in schools and homes (as some have maintained) we need to have a clear policy that is actually enforced. Brian P. Long 17:40, 3 May 2008 (CDT)

One thing that concerns me is that, particularly in an international environment, we could wind up in a 'no-win' situation; one group will be offended if we do put something in, and another will be offended if we leave it out. So you wind up offending someone no matter what you do... J. Noel Chiappa 10:34, 13 May 2008 (CDT)

Implementation Issues

On my day off, I've been poking around the 8e6 website (at http://www.xstop.com), to find out, in more detail, just how we might help users of filtering products work with Citizendium. The quick and dirty solution would be just to put tags on our pages that match the categories filtering products use. To some extent, it seems like this might be possible, but it does not look like it will be straightforward.

Looking at the list of categories of material from the xstop website (at http://www.8e6.com/database-categories.html), it's clear that the categories they use end up being very broad. If we stick with their categories, it seems like the best solution would be to tag all of our questionable material as 'R-Rated'. It's worth noting that this does not seem to have any real relation to the content of R-rated movies-- on balance, it probably comes out closer to PG-13. That said, PG-13 is probably about the worst Citizendium articles are going to get (that is: some profanity, some nudity, moderate violence).

On the other hand, though, if we do start getting content that strains the limits of the R-Rated tag, we could always tag these articles with xstop's 'Pornography/Adult Content' tag-- this would be the nuclear option, so to speak. My feeling is that xstop's R-Rated category is actually pretty broad, and would be a passable fit for most of the offensive content CZ might have.

Another option would be to implement something akin to Google's safesearch. xstop's products, and presumably those of other companies as well, can force safesearch on for users behind the filter. We could do something similar, but a lot more implementation/coding would be involved. On the other hand, this could be far more powerful and useful. We could also use R-Rated tagging as a stopgap measure, and then implement something better when we have the time. If no tech people volunteer, I will assume that the programming option is off the table. Thanks, Brian P. Long 13:06, 3 May 2008 (CDT)
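
For what it's worth, the safesearch-like option might look something like the sketch below, assuming pages already carry ratings of the kind sketched under Implementation. The function and field names are invented for illustration, not a design.

  def render_page(page, prefs):
      """Serve the full text, or a neutral placeholder when the reader
      (or a filter acting for them) has switched filtered mode on."""
      rating = page.get("rating", {})
      flagged = rating.get("profanity") == "yes" or rating.get("nudity") == "yes"
      if prefs.get("safe_mode") and flagged:
          return "This article is not available in filtered mode."
      return page["text"]

  article = {"text": "Full article text...", "rating": {"profanity": "yes"}}
  print(render_page(article, {"safe_mode": True}))   # placeholder text
  print(render_page(article, {"safe_mode": False}))  # full article text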

I just checked it out, and Net Nanny seems to be set up in much the same way as xstop's products. It too has categories for both R-Rated material ('Adult/Mature') and pornography. Net Nanny also has dynamic filtering for language, so that parents can decide in advance what vocabulary their children will and will not be exposed to on the internet. Thanks, Brian P. Long 16:20, 3 May 2008 (CDT)
Are these the same categories as XStop, or are they different? Do they use the same tags, or what? (And some details on the tags would be good.) If they are different, I foresee a whole lot of work ahead of us if we try to support N different companies' products. Maybe we could invite people from the various organizations to join the project, and tag the pages themselves? J. Noel Chiappa 10:34, 13 May 2008 (CDT)

I have also been corresponding with folks at NetNanny and 8e6 about how we might best proceed if we wanted to change our family-friendly policy. They have confirmed that profanity will pose a problem for filtering products. As I understand it, "dynamic analysis" has become the industry standard, and will block pages with profanity if that is how the filter is configured.

Another thing that has become apparent is that there already are a number of ways to categorize content so that it can be caught by internet filtering software. If we implement one of these, the filtering products should have no difficulty filtering out objectionable content.

At this point I sit down in the road and ask for help. There are at least three content tagging systems: PICS, ICRA and Safesurf. In the next little while I would like to flesh out the implementation section of the proposal, but I would not mind having some help in evaluating which of these tagging systems would fit best with our set-up.

Safesurf is at http://www.safesurf.com/classify/ . PICS is at http://www.w3.org/PICS/ , and ICRA is at http://www.fosi.org/icra/ . Let me know if you have any comments about these specific content tagging systems. Thanks, Brian P. Long 17:45, 8 May 2008 (CDT)
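
To give a flavor of what these labels look like: PICS labels are delivered as an HTML meta element on each page. The sketch below assembles one in Python. The rating-service URL and the category/value pairs are placeholders; the real vocabulary would come from whichever of the three systems we adopt.

  def pics_meta(service_url, page_url, ratings):
      """Build a PICS-1.1 label as an HTML meta element (short form)."""
      pairs = " ".join(f"{cat} {val}" for cat, val in ratings.items())
      label = f'(PICS-1.1 "{service_url}" l gen true for "{page_url}" r ({pairs}))'
      return f"<meta http-equiv=\"PICS-Label\" content='{label}'>"

  # The service URL and the categories below are placeholders, not a real vocabulary.
  print(pics_meta("http://example.org/ratings/v1",
                  "http://en.citizendium.org/wiki/Example",
                  {"language": 2, "nudity": 0}))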

When you say that "'dynamic analysis'... will block pages with profanity", does that just mean it will block individual pages, or the whole site? If the former, I think we could live with that; we just let everyone here know that, and they can make the choice. Perhaps we could mandate that such articles have a 'simplified' subpage version which contains no profanity? No, that won't work - they couldn't get to it, because the article's main page wouldn't show. So the main page of the article would have to be the 'sanitized' version, and the Advanced subpage could contain profanity. J. Noel Chiappa 10:34, 13 May 2008 (CDT)
My understanding is that dynamic analysis will block individual pages with profanity, while allowing access to the rest of the site. I was rather surprised to find that students at the school I usually work at (a middle school) are allowed access to Wikipedia, but are prevented from seeing objectionable content by what I assume is dynamic analysis. I don't know how they sort out offensive pictures on WP, but I could probably find out (I'm pretty sure that they use the 8e6 R3000, though). I haven't sorted out just what I think we should do about allowing access to pages with profanity-- if the profanity is integral to the fair treatment of the subject, I don't see how bleeping out the profanity will necessarily make the article any more family-friendly. Brian P. Long 20:11, 13 May 2008 (CDT)
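
For concreteness, here is a deliberately crude sketch of the sort of thing "dynamic analysis" presumably does on each page. Real products are far more sophisticated (scoring, context, image heuristics), so take this only as a cartoon of the mechanism:

  BLOCKED_TERMS = {"badword1", "badword2"}  # placeholder list, set by the administrator

  def page_allowed(page_text):
      """Block only this page, not the whole site, if it contains a listed term."""
      words = {w.strip(".,;:!?\"'").lower() for w in page_text.split()}
      return not (words & BLOCKED_TERMS)

  print(page_allowed("A perfectly ordinary sentence."))      # True
  print(page_allowed("This page contains badword1, alas."))  # False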

Experience with balancing knowledge need vs. controversial presentation

We have a health sciences group, which makes me think of the times I've developed -- or, more correctly, rewritten things that didn't work -- content filtering for healthcare facilities. When the filters prevent a surgeon from reading an article on the organ on which one does a mastectomy, that surgeon is not amused. When urologists and gynecologists get legitimate search requests blocked because they refer to some body part that could have an erotic connotation, they are upset. The mental health people are even more likely to need information on things that might be considered less than "family-friendly", since they need to understand what a patient was viewing.

In this case, the solution was to have different levels of filtering by user type, but rely on auditing as our main safeguard. That is much more difficult to achieve here. One possibility, however, might be to restrict access to controversial material to users with accounts. Obviously, that adds technical complexity and server load, and might deter the casual user.

When Masters and Johnson started on their research in sexuality, their choice of language and writing style for their first book, Human Sexual Response, was quite deliberate. It went beyond the "dull encyclopedic" into "journalspeak", avoiding colloquial language which, if in an online article, might trigger content filters.

Wikipedia does not object to describing things as "kinky", and I might myself do so in conversation, but an article is far less likely to hit filters with "paraphilia". At least in the sexual area, author/editors might be well advised to accept less general accessibility of the material, to avoid hints that can be sensationalized.

Wikipedia also likes graphics, which indeed can be illustrative, but which again can be just as problematic, or more so. It may be better to refer to external links for illustrations that are not absolutely essential to the content. Howard C. Berkowitz 11:42, 13 May 2008 (CDT)
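
For concreteness, Howard's "different levels of filtering by user type" idea might be sketched as follows. The roles and tiers here are invented purely for illustration; tying them to Citizendium accounts would be the real work.

  # Roles and rating tiers below are invented for illustration.
  ACCESS_CEILING = {
      "anonymous": 0,       # general-audience material only
      "account_holder": 1,  # has opted in to flagged material
      "health_editor": 2,   # clinical material, per the use case above
  }

  def may_view(role, page_tier):
      """True if this class of user may see a page rated at page_tier."""
      return ACCESS_CEILING.get(role, 0) >= page_tier

  print(may_view("anonymous", 1))      # False
  print(may_view("health_editor", 2))  # True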

I hear what you're saying, Howard, but my feeling is that we should move away from trying to self-censor to accommodate everyone (which I feel would be impossible). Content filtering in healthcare facilities is an aspect of the situation which I hadn't thought of, but which further illustrates the difficulty of anticipating the concerns of every kind of user. A sensible tagging policy, though, would allow healthcare facilities and other businesses to filter content they find objectionable and allow other content through. (I would be curious, though, just what products healthcare facilities use to filter their internet. Perhaps the 8e6 R3000...)
My understanding is that the most popular filtering products on the market today would likely not filter the entire en.citizendium.org domain for using the word "kinky". They might filter the individual pages, or all pages with the word "kinky", but they might filter all of the pages with the word "paraphilia" as well. Playing this kind of guessing game is counterproductive.
I think it bears repeating that human sexuality is by far the trickiest area for which to implement any kind of programme of self-censorship. The only way I can see would be to spell out, in advance, just what human sexuality articles are family-friendly and what are not. Even assuming that we could draw up such a list, all interested Citizendium contributors would have to come to some consensus on just what is family-friendly and what is not, despite stark differences in sexual mores and personal outlook. We should follow the guidelines of the American Library Association, and move away from self-censorship. Brian P. Long 20:01, 13 May 2008 (CDT)
Years ago, when I worked for the Library of Congress, I was an American Library Association member, until a very noisy fringe group published a position paper arguing that literacy tests for librarians were discriminatory. So, yes, the ALA can do some very good things, but may I suggest that the concerns of a librarian are somewhat different from those of a content producer?
Now, I recognize that content filtering is, by no means, an utterly reliable technology. I'm reminded of a time when my postings to a computer networking forum kept getting filtered, on articles having to do with the operation of the Border Gateway Protocol, which shares information about the routing policies of Autonomous Systems, often abbreviated AS. It turned out that I was writing things that dealt with the concept of more than one autonomous system, so, not thinking of any possible consequences, I did what many writers do -- I did questionable things to an abbreviation, treating it as a word. In this case, I created the plural of AS by adding the suffix -ses. In retrospect, -es probably would have kept me out of trouble.
There is a difference between self-censorship, and writing in a form appropriate for the particular venue. I have written erotica. I have written medical articles. I have written medical articles on reducing risk in certain...ummm...paraphiliac activities. Each of those three took a different writing style; the last was challenging because I needed to convey, to a not-necessarily-kinky healthcare professional, the medically significant nature of the act in which the patient took part. For the record, there is at least one directory of what are called "kink-aware professionals" (KAP), who are not only nonjudgmental, but don't need extensive explanations to give advice on minimizing risk of what they know someone is going to do.
While I have no idea which workgroup's editors would have responsibility, I am quite confident that I could work with an author who wanted to discuss paraphilias in a non-titillating way, and produce something that didn't lose essential meaning. That, to me, is helping an author, not censoring. In some cases, that adds precision; if I were describing the mode of action of certain medications about which we are often spammed, but which have legitimate medical applications, it might be appropriate, in a pharmacology article, to speak of sildenafil's inhibition of type 5 phosphodiesterase in the corpora cavernosa and corpus spongiosum. In a less technical article, I might say that the drug increases the effect of the body's mechanism for filling erectile tissue with blood. The first is probably family-confusing, the second neutral, but I could also describe the effect of PDE-5 mediated release of nitric oxide in a manner that might cause some families to attempt to exorcise their screen.
I don't believe I'm speaking of censorship, but of assisting in describing human actions in accurate terminology. Yes, I agree that some people will consider certain behaviors, no matter how described, as corrupt, sinful, and to be banished. My personal attitude is that adults have the freedom to choose a wide range of things, "as long as they don't do it in the street and scare the horses." You can never please everyone, but you can help authors stay focused on the key message. Howard C. Berkowitz 20:44, 13 May 2008 (CDT)
I want to make it clear that I'm not advocating writing encyclopedia articles in colorful language that may be less precise. My issue is that I think our current policy is unclear, and that, upon further reflection, I think it could be broadly improved upon. At the same time, though, I think the ALA guidelines are relevant in that I don't think it is our business to judge for school districts, parents, and healthcare facilities what is and what is not appropriate for them and their charges. Brian P. Long 21:18, 13 May 2008 (CDT)

Filtering software is not something to be relied upon if your goal is maximum readership. I run excellent filtering software at home and the rankest of the rankest Wikipedia articles are not filtered. My oldest son's public high school - the entire school district - runs filtering software. But it has also blocked Wikipedia totally because of the lack of self-censorship there.

Look, every traditional print general encyclopedia has a "family friendly" policy. This is to maximize utility for as many readers as possible. Citizendium should be no different. Topics like sexual reproduction can most certainly be dealt with in a scholarly fashion, and not diminish information, and be "family friendly".

My own feeling is that this policy will naturally evolve in the context of concrete cases, in reference to the foundational "family friendly" policy.

Stephen Ewen 21:34, 14 May 2008 (CDT)
