One of the things the internet has created is a way for nearly anyone to say anything for very little money and make it look good. A common complaint on social media in recent months has been the "rampant spread of fake news" online. That term means different things to different people: it can range from absurdly inaccurate information to information that merely disagrees with the reader's preexisting beliefs on the subject.
Somehow, the blame for this "epidemic" has been placed on Facebook's and Google's shoulders, despite their lack of involvement in the phenomenon. In fact, Google only indexes websites, regardless of their content, and Facebook merely allows OTHER people to share the content they find. Neither company nor service produces any content at all.
For better or worse, both companies have been forced to respond to a situation that could be solved by critical thinking and reading comprehension - skills that this episode suggests many people no longer exercise. The two companies have responded to the scenario very differently, with both launching their strategies this week.
Facebook seems to believe that the solution to the problem lies in public education. The company has introduced a new program, the News Integrity Initiative, whose purpose is to help people spot nonsense when they read it online. This $14 million initiative is backed not only by the social network, but also by the Ford Foundation, the Mozilla Foundation and others.
This joint fund will not be used to create technology that removes the thought process from reading content published online. Instead, it will be used to help people learn how to think for themselves. Facebook and its partners do not want to eliminate critical analysis from people's lives, as that would ultimately make the problem worse; the initiative is designed to strengthen it. When you read content, you should be capable of drawing educated conclusions from what you have read - it should not be up to someone else to tell you the way it is.
Google, meanwhile, has taken a different approach to the legitimacy of online content. The company has begun to label content that another organization has "fact checked." This label joins the existing system of labels such as "In-Depth," "Opinion," "Blog" and "Local Source."
With this move, Google is placing a lot of trust in third parties. It is staking its own brand's credibility on organizations like Snopes and PolitiFact, both of which have had their own difficult relationships with reality in the past. No one is infallible, and everyone looks at the world through tinted glasses. Their view of reality might be colored by their past experiences, political affiliation or religious beliefs. What is certain is that no two people live in the same reality.
Following Facebook's path carries a lot of potential for failure. Many people today dismiss self-improvement as beneath them. We all know people who think they are too good to learn anything new, because they already know what is going on. Having a social network try to teach them something could come across as lecturing, and the overall goal could be lost entirely.
Following Google's path is dangerous in a different way. Its news content could easily become politicized, rather than indiscriminate, as it is today. An opinion that differs from that of the editors of a fact-checking organization could be marked as invalid, or a fact that disproves a held belief could be discounted. All of this could potentially damage the national dialog, as well as the value of Google's search results.