Anita Varma is the program manager for Journalism & Media Ethics as well as Business, Leadership, and Social Sector Ethics at the Markkula Center for Applied Ethics. Views are her own.
A viral, verifiably false video has brought Facebook back under fire for not removing politically incendiary misinformation from its platform.
When pressed to justify why the video remains available, Facebook's Monika Bickert stuck to a consistent and ethically troubling talking point: it's up to people to decide for themselves.
"Why keep it up?" Cooper asked.
"We think it's important for people to make their own informed choice about what to believe," Bickert replied.
Curiously, Bickert was insistent that Facebook was providing users with abundant notification that the video was false, yet also said that Facebook should not decide what is true or false. Doing so, she said with incredulity, would amount to asking "a technology company to be the decider of what is true and what is false."
Instead, Bickert asserted, Facebook's role is to act as a conduit for independent fact-checkers "and then put that information in front of people so that they can make an informed choice."
Bickert's refrain that individuals should decide for themselves mirrors the questionable logic of responsibilization. A neoliberal concept, responsibilization means shifting responsibility for structural problems away from institutions and onto individuals. With echoes of invocations of media literacy as a remedy for misinformation, the implication is that it is up to each user, not the platform, to sort truth from falsehood.
From the perspective of a company like Facebook, the rhetoric of responsibilization provides an excellent escape hatch from the more gnarled question of whether Facebook's reach and political content are harming society. Instead, their argument seems to be that it's up to people to decide, in which case Facebook is not falling short of its responsibility but actually excelling by providing people with additional context to make decisions for themselves.
From the perspective of people living in a Facebook-fractured world, however, Facebook's response sounds disconcertingly like a chemical company dumping hazardous waste in a residential neighborhood and then declaring it residents' responsibility to move if they are bothered by getting cancer.
Abdicating responsibility for a problem is often an effective public relations tactic, but it is a far cry from public accountability. Throughout the interview and elsewhere, Facebook's representatives have persistently sidestepped the ethical question of whether political content that is verifiably false should be disseminated on a platform with billions of users. Facebook's polished refusal to reckon with its role in misinformation is predictable, if frustrating, and its strategic use of responsibilization suggests that it is time to abandon any last remnants of hope that Facebook could be an instrument for social good.
Until Facebook is willing to acknowledge and act upon its central role in the misinformation landscape, a role far greater than that of a common carrier or innocent conduit, we need to recognize Facebook for what it is and seems to insist on remaining: a platform that fosters misinformation by design, lets hate fester, and placidly insists that this is not only fine but a desirable state of affairs.
Moving from bewildering to bizarre, Bickert's characterization of what kind of business Facebook is in drove a final nail into the coffin of the idea that Facebook fights misinformation rather than proliferating it. Asked about Facebook's responsibility as a news business, Bickert replied, "We aren't in the news business, we're in the social media business."
If Facebook is truly not in the news business, then its policies should be updated to prohibit the sharing of news content on the platform. Furthermore, if Facebook refuses to provide a coherent justification for its actions, one that does not depend on a laughable attempt to minimize its role, then it seems long past due that people, whose responsibility it apparently is to decide for themselves, get out of Facebook's social media business before its societal, political, and cultural harms take us past the point of no return.