We hear this phrase thrown about all the time: “Don’t believe everything you see on the internet.” Yet the recent controversies concerning fake news on social media prove that this advice is neither well practiced nor completely in our control. According to a study by BuzzFeed News, in the last three months of the 2016 election, people engaged more with fake election news than with credible articles from The New York Times, The Washington Post and other established sources. Psychological research on a phenomenon called the “illusory truth effect” shows that repeated exposure to a piece of false information makes people more likely to believe it is true. Taken together, these findings show that fake news is a real problem for social media and its users.
Recently, Facebook has received criticism for the prevalence of fake news trending on its site. False articles such as “The Pope endorses Trump” and “Hillary Clinton bought $137 million in illegal arms” gained hundreds of thousands of views during the election, leading to accusations that this fake news played a part in Trump’s victory. Falsehoods are spread throughout the internet, yet Facebook’s huge audience of 1.79 billion monthly active users around the world makes the frequency of fake news on the site a particular concern. Facebook C.E.O. Mark Zuckerberg denied that this situation was to blame for the election’s outcome, yet he later wrote that the social media site takes “misinformation seriously.” There has been news and progress concerning misinformation since then, yet Zuckerberg was correct in pointing out the philosophical complexities of the situation. The possible solutions still pose problems.
Surveys asking Facebook users to identify false articles are already being implemented as part of the effort to fight fake news. The surveys ask users either “To what extent do you think that this link’s title withholds key details of the story?” or “To what extent do you think this link’s title uses misleading language?” While the details of how this data will be used are not specifically stated, it is certain that the surveys are part of the effort to address misinformation. A clear problem with these surveys is that the users answering them are the same ones who failed to recognize these articles as false during the election. Another problem with users regulating content is the possibility that people or groups may flag articles in order to censor ideas or news that they do not want spread.
Facebook has also met with media and entertainment companies with the goal of partnering on a new feature called “Collections.” Reportedly, this would be a feature on Facebook similar to Snapchat’s “Discover,” and its goal would be to curate verified news for the site. This possible solution comes with problems as well. The most evident is that while this information will be available, whether users will utilize or accept it remains unknown. The supposed inspiration, Snapchat’s “Discover,” is seen by over 150 million users daily, yet many of its articles are targeted toward young audiences interested in celebrities and tabloid news. Some of its sources are BuzzFeed, Cosmopolitan and MTV, though scrolling to the bottom you can see The Wall Street Journal, CNN and National Geographic as well. Facebook has a much larger audience that includes a wider range of users, and its collection of sources would need to reflect that.
A scandal from earlier this year presents another problem for this project. Facebook’s trending section came under scrutiny when it was reported that its editors were purposely keeping conservative news from appearing on the list. “Collections” would have to be careful to present news from all sides, both to be accepted and to avoid becoming the target of further criticism.
While these solutions have their problems, Facebook is actively addressing the issue of misinformation in a way that does not infringe on the freedom of speech that is central to the site. Zuckerberg wrote in a post that the company believes in “erring on the side of letting people share what they want whenever possible,” which is an important principle to maintain in the fight against misinformation. Possible censorship is a greater evil than false news stories.
Alyssa Luis is a weekly columnist for The Daily Campus opinion section. She can be reached via email at firstname.lastname@example.org.