Do you remember telling your algebra teacher that formulas would never matter in your life? It seems that we owe her an apology.
These days, almost everything you interact with is guided by computer algorithms, which are a bit more complicated than their high school algebra equivalents. These formulas make decisions that affect our lives every day, such as which buildings fire inspectors visit, whom we match with on dating sites and what information shows up in a Facebook feed or Google search.
“Algorithms are starting to make decisions for us in all walks of life,” said Nick Diakopoulos at the “Algorithms are the New Gatekeepers” session Friday at the Online News Association 2014 conference.
The session focused on the potential impact of algorithms on democracy through their influence over the spread of information. Diakopoulos cited research showing that the amount of news shown to users on Facebook affects their interest in politics and even the likelihood that they will vote.
Despite the widespread influence of these algorithms, very little is known about how they operate. That may be about to change.
Panelist Kelly McBride of the Poynter Institute said that algorithms can be polarizing. They play to the fears of conspiracy theorists and prompt people who are naturally trusting to defend their behavior.
“Until you get burned by an algorithm, you’re going to be completely trusting,” McBride said in an interview.
Some publishers have been feeling burned by Facebook’s algorithm lately. Several people asked Liz Heron, head of news partnerships at Facebook, to explain changes in the performance of their organizations’ content on Facebook at the “Mobile/Social Mashup” session Friday. They expressed frustration and dissatisfaction with her answers.
Facebook is too vague about its algorithm and what works when you DON’T pay to boost pages/posts. Not ideal. #mobilemashup #ONA14
— Brian J. Manzullo (@BrianManzullo) September 25, 2014
My take: journalists want to know more about FB algorithm bc we’re investigative by nature and don’t like secrecy. #mobilemashup #ONA14
— Gabe Travers (@gabetrav) September 25, 2014
Facebook recently announced changes to how it selects which posts appear in a user’s feed, which it shared on media.fb.com. But the site does not offer specific explanations of what content will be successful in the news feed, because the feed is supposedly personalized to each user’s preferences.
This conversation bubbled up earlier this year when it became public that Facebook altered stories on some users’ feeds as part of a study on how emotions can be affected by newsfeed content.
Some people have said this creates an imbalance of power between the platforms readers use to access content, such as search engines and social networks, and the content producers that rely on those platforms for traffic. New York University journalism professor Jay Rosen said that “Facebook has all the power” in an interview published in The Atlantic in July.
McBride said companies such as Google and Facebook don’t have a legal or moral obligation to release details about their algorithms unless they are concerned with the success of democracy and the democratic spread of information. She said these companies haven’t had a public conversation about whom they have an obligation to serve, so it seems their primary duty is to stockholders.
But there is some evidence that these conversations are occurring. A representative from Google approached McBride to talk after the session. He told McBride he was interested in working together to be more transparent about Google’s algorithm – but couldn’t make any sort of public comment about it.
Facebook lists its mission statement on its investor relations website.
Founded in 2004, Facebook’s mission is to give people the power to share and make the world more open and connected. People use Facebook to stay connected with friends and family, to discover what’s going on in the world, and to share and express what matters to them.
The company also lists 10 principles, many of which relate to transparency and the free flow of information.
Google’s “10 things we know to be true” similarly focuses on universal access to information, and it notes that the PageRank algorithm is used to sort the millions of pieces of content posted on the Internet.
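For readers who have never seen one, the idea behind PageRank can be sketched in a few lines: a page’s importance is estimated from the importance of the pages that link to it, computed by repeatedly redistributing scores across links. The toy Python version below is an illustration only; Google’s production ranking layers many other, undisclosed signals on top of it.

```python
def pagerank(links, damping=0.85, iterations=50):
    """links[i] lists the pages that page i links to."""
    n = len(links)
    rank = [1.0 / n] * n  # start every page with equal rank
    for _ in range(iterations):
        new_rank = [(1.0 - damping) / n] * n
        for page, outgoing in enumerate(links):
            if outgoing:  # a page passes its rank to the pages it links to
                share = damping * rank[page] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += share
            else:  # a page with no links spreads its rank evenly
                new_rank = [r + damping * rank[page] / n for r in new_rank]
        rank = new_rank
    return rank

# Pages 0 and 1 both link to page 2; page 2 links back to page 0.
print(pagerank([[2], [2], [0]]))
```

In this tiny example the heavily linked-to page 2 ends up with the most rank, which is the whole intuition: links act as votes, and votes from important pages count for more.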
One of the problems in addressing algorithms is a lack of public literacy. More than 62 percent of Internet users are unaware that algorithms shape what they see, according to a presentation by Karrie Karahalios, an associate professor of computer science at the University of Illinois.
McBride said there is a need to raise awareness of these algorithms and to help people understand how the information they view online is manipulated.
“Definitions are really important, and part of literacy is understanding how these words are functioning and making sure that these words … mean the same thing that all the citizens think they mean,” McBride said. “Obviously, ‘most recent’ doesn’t mean the same thing to Facebook that it means to me.”
What Facebook considers “news” is not always in sync with what the media considers “news,” admits @lheron #mobilemashup
— Tory Starr (@torystarr3) September 25, 2014
Considerations for Journalists
Because reporters and traditional media organizations are no longer gatekeepers of information, Axel Bruns, a professor at Queensland University of Technology, writes that journalists have transitioned to “gatewatchers” responsible for policing the flow of information.
“Reporters have to go outside the algorithm to do their jobs well because otherwise they’re becoming a slave to the algorithm,” McBride said, emphasizing the importance of finding information outside the Internet to complement online research.
Investigative reporting, or “broccoli journalism,” was one example of content that is important but not often rewarded or elevated by algorithms. McBride brought up an example of an investigative story by a Chicago Tribune reporter about flame retardants.
“Nothing about the algorithm would have suggested that that is a story that she should have trained her energies on,” she said. But it was still in the public interest and worthy of the effort.
In another presentation, “10 tech trends in journalism,” Amy Webb said that filters narrow what we see but can also benefit journalists. She said we have to balance the problems of the “filter bubble” against the fact that algorithms help us navigate an enormous amount of content.
The app Nuzzel, for example, has a “news from my friends” section that shows whose shares are influencing what a user reads.
You can see what I’m talking about at http://t.co/92iEYN2wvL #ONA2014
— Amy Webb (@webbmedia) September 27, 2014
Webb also showed a beta feature of Nuzzel: curated news lists. A list of journalists covering unrest in Ferguson, Missouri, for example, shows what they’re sharing and reading about the situation. Algorithms and bots can also be used to translate information from public agencies into a news format, which reporters can then update.
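To make that last point concrete, here is a minimal sketch of how a bot might turn one structured record from a public agency into draft copy for a reporter to verify and expand. The feed format and field names are hypothetical, not drawn from any specific agency.

```python
import json

# Headline-style template filled from one structured record.
TEMPLATE = ("A magnitude {magnitude} earthquake struck {distance} miles from "
            "{place} at {time}, according to preliminary agency data.")

def draft_story(record):
    """Turn a parsed agency record into a first draft for a reporter."""
    return TEMPLATE.format(**record)

# A hypothetical record, as it might arrive from an agency's JSON feed.
sample = json.loads('{"magnitude": 4.2, "distance": 12, '
                    '"place": "San Jose, Calif.", "time": "3:14 a.m. Friday"}')
print(draft_story(sample))
```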
Diakopoulos said media organizations should investigate these algorithms on their own, though the methodology and the law have not quite caught up. Some organizations have started to do this, such as when a reporter from The Atlantic sought to trick Facebook’s algorithm with fake posts.
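As an illustration of what such black-box auditing looks like in miniature, the sketch below feeds identical posts to a stand-in ranking function under two different user profiles and compares the orderings. The scoring rule is invented for this example; a real audit would probe a live platform whose internals are hidden, much as the Atlantic experiment did.

```python
def rank_feed(posts, profile):
    """Order posts the way a hypothetical feed might: interest overlap
    weighted heavily, plus a small popularity bonus."""
    def score(post):
        topical = len(set(post["topics"]) & set(profile["interests"]))
        return topical * 10 + post["likes"] / 1000
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": "city-budget", "topics": ["politics"], "likes": 80},
    {"id": "cat-video", "topics": ["pets"], "likes": 9000},
    {"id": "school-board", "topics": ["politics", "education"], "likes": 40},
]

# The same posts, ranked for two different hypothetical users.
for profile in ({"interests": ["politics"]}, {"interests": ["pets"]}):
    order = [p["id"] for p in rank_feed(posts, profile)]
    print(profile["interests"], "->", order)
```

The point of an audit like this is the comparison, not the scoring rule: by holding the inputs fixed and varying the user, a reporter can infer what a hidden algorithm rewards without ever seeing its code.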
One audience member, Julie Posetti, brought up what could be a double standard in this conversation: if journalists want companies such as Facebook and Google to reveal their algorithms, media organizations will need to share their formulas as well.

The panelists said they have seen increasing interest in these issues and think we are on the cusp of a wave of efforts to increase awareness. McBride said she would like to see more investment in research on algorithms, to hold companies accountable.

A summary and recording of the algorithm session is available on the ONA website.
Webb’s presentation is also available.
You can download this presentation here: http://t.co/X02HSPqMkA #ONA2014
— Amy Webb (@webbmedia) September 27, 2014