Online commenting
This section addresses these ethical issues:
- What is your purpose in providing for online comments?
- Do you consider it an extension of your news product?
- What rules should you impose?
News organizations are not required to host online comments, but they are expected, as part of their journalistic responsibility, to provide a voice for the public in some form. In fact, in 1947, the U.S. Commission on Freedom of the Press, also known as the Hutchins Commission, said that a “forum for the exchange of comment and criticism” is one of five requirements “society is entitled to demand of its press.”
In modern times, the public often speaks electronically by liking, sharing and commenting. Unfortunately for many news organizations, commenting has become a journalistic nightmare. Will public comments be what The Washington Post called a “cesspool,” or the public good many envisioned when online comments became part of the news terrain in the 1990s?
The first thing for a news organization to decide is whether it wants to host online comments at all. Not all organizations do. If you do decide to allow comments, for what purpose are you doing so? Is it to provide a forum for discussion? To build community? To track public opinion? To give voice to the voiceless? To determine which posted stories are most popular? To grow your own audience? To determine your news agenda?
Most important, do you consider comment sections part of your news product — governed by the rules you apply to your own journalism, letters to the editor and opinion columns — or will you have a separate standard for commenters? The more you consider comments part of your news product, the more the comments section should reflect your journalistic ethics.
For instance, if you take a tough line on anonymous quotes in your news stories, you may well want to ban anonymous comments. A study in a 2014 issue of Journalism Practice found that anonymous comments tend to be less civil than comments posted under real names.
Not all editors and publishers ban anonymous comments, however. A 2014 Sounding Board survey by the Associated Press Media Editors (APME) association found that 46 percent of the 101 survey participants allowed anonymous postings. Still, the majority (54 percent) required commenters to identify themselves, and 38 percent required commenters to use their first and last names. For better or worse, some believe that banning anonymous posting significantly cuts the number of people who will leave comments.
You also should consider what obscenities and vulgarities you will allow. Some organizations remove them all, sometimes through automation. Others allow them in cases where they’re used simply as a device for linguistic emphasis, rather than insult. Each organization needs to make its own decision.
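For illustration only, here is a minimal sketch in Python of what simple list-based automation might look like. The blocklist and the `flag_comment` and `redact_comment` helpers are hypothetical, not drawn from any newsroom’s actual system; production moderation tools typically rely on commercial services or machine-learning classifiers that handle misspellings, context and deliberate evasion far better than a word list can.

```python
import re

# Hypothetical blocklist for illustration; a newsroom would maintain its own,
# and would decide whether to reject, hold or redact flagged comments.
BLOCKED_TERMS = {"damn", "hell"}


def flag_comment(text: str) -> bool:
    """Return True if the comment contains a blocked term."""
    words = re.findall(r"[a-z']+", text.lower())
    return any(word in BLOCKED_TERMS for word in words)


def redact_comment(text: str) -> str:
    """Mask blocked terms with asterisks instead of rejecting the whole comment."""
    def mask(match: re.Match) -> str:
        word = match.group(0)
        return "*" * len(word) if word.lower() in BLOCKED_TERMS else word

    return re.sub(r"[A-Za-z']+", mask, text)


if __name__ == "__main__":
    sample = "That was one hell of a council meeting."
    print(flag_comment(sample))    # True
    print(redact_comment(sample))  # That was one **** of a council meeting.
```

The policy choice embedded in that sketch, whether to reject a comment outright or merely mask the offending word when it is used for emphasis rather than insult, is exactly the editorial decision each organization has to make for itself.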
However, few organizations allow hate speech or discriminatory language based on race, ethnicity, gender, sexual orientation, economic status, religion or culture. Such comments tear at the concept of civil discourse and diminish public confidence in the press, which, at 7 percent, is at its lowest level ever (The Associated Press-NORC Center for Public Affairs Research, “Confidence in Institutions: Trends in Americans’ Attitudes Toward Government, Media, and Business”).
Once you’ve settled on your policies, be forthright about your expectations for comments and commenter behavior. Make clear whether there is a line commenters are not allowed to cross, based on your view of journalism’s social responsibility, society’s expectation of a forum for comment and criticism, your purpose for hosting online comments, and the harm done to the news organization, to journalism and to the public good when commenters are allowed to cross that line.
If bullies and bigots take over the comment platform and use it as their personal playground, public perception of the quality of your journalism will suffer, even if none of your staff members is involved. One study showed that uncivil comments polarize readers and change the way they understand a story. An active human moderator can manage troll behavior and encourage commenters to adopt a constructive tone consistent with your news organization’s vision for its online comment forum.
There’s a cost, though. While an active human moderator can help achieve your news organization’s vision for online comments, some editors participating in the APME survey expressed concern about the amount of staff time moderation requires. Around-the-clock moderation was rare: only 12 percent of editors reported 24/7 staff monitoring, and 27 percent of respondents said staff moderated comments 13 to 16 hours a day.
There’s another element of online commenting to consider: when readers post on your site, they are providing you with data. Depending on the registration forms and technology you use, you may be able to identify individual users and study their interests and browsing habits on your site and elsewhere. That requires thinking about what restrictions you will place on the use of this data.
Will you turn it over to law enforcement? Require a subpoena first? Will you let your own editors see how users browse your site, either on an individually identifiable basis or in aggregate? You may not anticipate that anyone will want this data, but questions of data access and privacy can arise very quickly.
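As a rough illustration of the aggregate-versus-identifiable distinction, the sketch below, again in Python and using hypothetical names (`PageView`, `aggregate_by_section`, `identifiable_views`), shows one way to give editors aggregate counts while gating access to individually identifiable records behind an explicit authorization check. Who grants that authorization, and under what legal standard, remains an editorial and legal decision, not something code settles.

```python
from collections import Counter
from dataclasses import dataclass


@dataclass
class PageView:
    """One hypothetical record tying a registered commenter to a page visit."""
    user_id: str   # individually identifiable
    section: str   # e.g., "politics" or "sports"


def aggregate_by_section(views: list[PageView]) -> Counter:
    """Editors see only counts per section, never which user viewed what."""
    return Counter(view.section for view in views)


def identifiable_views(views: list[PageView], authorized: bool) -> list[PageView]:
    """Individually identifiable records require explicit authorization,
    for example a subpoena reviewed by counsel; that judgment is policy, not code."""
    if not authorized:
        raise PermissionError("Identifiable commenter data is restricted.")
    return views


if __name__ == "__main__":
    log = [PageView("u1", "politics"), PageView("u2", "politics"), PageView("u1", "sports")]
    print(aggregate_by_section(log))  # Counter({'politics': 2, 'sports': 1})
```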
Precisely because online comments are the forum that gives a voice to the modern-day public, they are important to the future of journalism in a democratic society. As a forum for the exchange of public comment and criticism, online comments are too important to be run without careful thought about their purpose and management.
The main author of this section is Paula Poindexter, journalism professor at the University of Texas at Austin, and 2013-2014 president of the Association for Education in Journalism and Mass Communication (AEJMC).