Facebook could make nine “incremental” changes to ensure it becomes a better forum for free speech and democracy, according to a new report by academics at the University of Oxford in the UK and Stanford University in the US.
Proposals include: an external appeals body; more user control over News Feeds; and better content review and fact-check mechanisms.
The report, Glasnost! Nine Ways Facebook Can Make Itself a Better Forum for Free Speech and Democracy, argues that the growing influence of Facebook – as well as other platforms such as Instagram, YouTube and Twitter – on the personal, cultural and political life of billions of people has led to widespread concerns about hate speech, harassment, extremist content, polarisation, disinformation and covert political advertising.
Amid calls for government regulation, Facebook has recently begun working to regain the trust of the public, politicians and regulatory authorities, largely through greater transparency. The platform is also consulting widely with researchers, journalists, policy-makers and civil society activists.
The report, which the authors describe as part of a process of “constructive engagement” with the technology company, identifies specific issues concerning political information and political speech, provides an overview of the major changes Facebook has made in recent years, and offers nine recommendations as to what more it should do:
The report’s nine recommendations for Facebook:
1. Tighten Community Standards wording on hate speech
2. Hire more and contextually expert content reviewers
3. Increase ‘decisional transparency’
4. Expand and improve the appeals process
5. Provide meaningful News Feed controls for users
6. Expand context and fact-checking facilities
7. Establish regular auditing mechanisms
8. Create an external content policy advisory group
9. Establish an external appeals body
Lead author Timothy Garton Ash, building on the analysis in his recent book Free Speech: Ten Principles for a Connected World, concludes that while industry-wide self-regulation should be actively pursued, attaining it will be a “long and complex task”.
“In the meantime, the best should not be the enemy of the good. As we have indicated in this report, there is a great deal that a platform like Facebook can do right now to address widespread public concerns, and to do more to honour its public interest responsibilities as well as international human rights norms. Executive decisions made by Facebook have major political, social, and cultural consequences around the world. A single small change to the News Feed algorithm, or to content policy, can have an impact that is both faster and wider than that of any single piece of national (or even EU-wide) legislation,” Garton Ash writes.
The report was prepared by the Free Speech Debate project of the Dahrendorf Programme for the Study of Freedom, St. Antony’s College, Oxford, in partnership with the Reuters Institute for the Study of Journalism, University of Oxford, the Project on Democracy and the Internet, Stanford University, and the Hoover Institution, Stanford University.