EU Expert Group on Fake News Releases Final Report
On Mar. 12, the European Commission released the final report of its independent High Level Expert Group on Fake News and Online Disinformation (the Group), a group of 39 experts from different sectors and countries that was convened earlier this year and tasked with putting forward strategies to counter disinformation. The report, which is one part of the Commission’s efforts to formulate a new strategy to tackle disinformation more effectively, calls for a “multi-dimensional approach to disinformation” and emphasizes that there is no single root cause or solution to such a complicated problem.
The report is a dense document. While it is bold in some ways, calling for immediate action with ambitious deadlines, in other ways it is cautious, emphasizing the need for more comprehensive research and evidence about the disinformation problem and potential solutions. This tension is not surprising. The Expert Group that authored it was only convened in mid-January and consisted of 39 members from different backgrounds including academia, journalism and media, online platforms and civil society. This necessitated compromise, and indeed the Chair of the group explicitly thanked members in the Foreword for “setting aside their own predilections” for the collective good. As four members of the group said in a separate blog post, “the report should be read as a compromise document.”
However, beyond these disclaimers, the report is significant and important for three reasons. First, it is an urgent call to arms, underlining the seriousness of the threat of disinformation and the desperate need for more information about it. Second, it sets out a concrete plan of action with a pressing timetable, in which the role of states and regulators is narrowly confined. Finally, and perhaps most importantly, it is evidence that stakeholders are willing to share data and collaborate on strategies to combat the issue. Collaboration of this nature will enable the transparency and informed research needed to formulate effective solutions to this opaque and complex problem.
Call to Arms
The report emphasizes that the threat of disinformation goes well beyond “fake news.” In fact, it advocates for the abandonment of the term because it has been appropriated by powerful actors to dismiss coverage that they find disagreeable. The broader phenomenon of disinformation “represents risks for our democratic processes, national security, social fabric, and can undermine trust in the information society.” This more comprehensive and complex threat includes all forms of false, inaccurate or misleading information designed either to cause public harm or generate profit. The report speaks mostly in generalities but gives a few examples, including malicious fabrications, infiltration of grassroots groups and the use of automated amplification techniques. It also notes that disinformation is not confined to politics; it can arise in a variety of other sectors, such as debates over health, science, education and finance. The Group acknowledged that this content might not be illegal, but that it can nonetheless harm democratic values and processes, including elections.
The report states that the increasingly digital environment has enabled an increase in the volume of various kinds of disinformation and that the role of digital media, particularly large US-based platform companies, in this phenomenon is “important but not yet well understood.” The growing power of these companies (such as Facebook, Google and Twitter - all of whom had representatives as members of the Group) to enable or interfere with the free circulation of information comes with growing responsibilities.
The report’s approach to tackling the issue rests on five pillars:
Enhanced transparency about online news;
Greater citizen media and information literacy;
Empowering users and journalists with tools they can use to flag and avoid disinformation;
Promoting a diverse European news ecosystem; and
Promoting continued research into the impact of disinformation and ongoing evaluation of responses to it.
Plan of Action
The Group advises the Commission to “disregard simplistic solutions.” It includes in this any attempt to censor its way out of the problem, whether by direct censorship or by “the politically dictated privatization of the policing and censorship of what is and is not acceptable forms of expression.” The report dedicates a separate section to the legal mandates that require respect for freedom of expression. Responses should instead focus on increasing citizens’ resilience to disinformation. The report envisions a somewhat secondary role for governments, focusing on their role in providing funding and support for other initiatives. In other words, regulation is to be avoided, “as government or EU regulation of disinformation can be a blunt and risky instrument.” Countries such as Germany and the U.K. may take this as a rebuke: Germany has recently enacted a broad law targeting problematic online content, the U.K. is currently conducting an inquiry into fake news and considering legislation, and other member states, such as France, have made similar noises.
The report instead invites the Commission to promote a general, Europe-wide Code of Practices for countering disinformation that would set out clear rules for each stakeholder group. These stakeholders include government, media, fact-checkers and civil society. But the “key principles” that the report suggests should be embodied in the Code are primarily directed at the technology companies and online platforms, not government or other members of the news ecosystem. These principles focus on how platforms can ensure transparency and accountability, while empowering users, media, civil society and researchers.
The report suggests that a Coalition of stakeholders be formed immediately to elaborate the Code, with a view to it being implemented by Jan. 1, 2019. This Coalition would be “facilitated” by the Commission, but the report suggests it will be a voluntary association of stakeholders. Its effectiveness should be continually monitored, including through a dedicated review in Mar. 2019, ahead of the European Parliament elections. Alongside this, the report calls for the creation of a European Centre of Excellence to manage a network of national research centres to enable the creation and dissemination of research into the problem and the effectiveness of solutions. Also critical, says the report, is increased funding to support the news media ecosystem and media and information literacy initiatives. The report notes that in the U.S., the State Department has been given $120 million to counter foreign efforts to meddle in elections or sow distrust in democracy, and hopes that the European Commission and member states will “make at least a similar level of financial commitment.” (The report does not mention that the State Department has not spent any of this funding.)
Commitments by Stakeholders
The theme throughout the report is the need for an evidence-based collaborative approach that includes continual monitoring of the problem and the systematic evaluation of the efficacy of attempted solutions. This is only possible if there is access to data about the prevalence and spread of disinformation to help better grasp the scope and nature of online disinformation and how it spreads, and to develop potential solutions such as “source transparency indicators.” As such, the report contains repeated calls for greater data-sharing by platforms.
Frustration with the lack of transparency of the online information environment is not new, and researchers have long been calling for greater access to social media companies’ data, including greater transparency over how their algorithms work to identify and promote certain content. As the report says, the online ecosystem is “opaque - sometimes by design.” What is notable about this document, then, is that representatives of some of the most important holders of this data - including Facebook, Google and Twitter - have signed off on the calls for greater collaboration and data-sharing. As the four Group members who authored a separate blog post noted, this is “particularly significant” because the companies have “now taken a public commitment to work with researchers who can independently assess the spread and impact of disinformation.” The commitment, of course, lacks specificity at this stage. But if it does lead to the kind of transparency researchers have long sought, it would be a step in the right direction.
The Group’s report will feed into a European Commission Communication on the issue of fake news and disinformation that is expected by late April. The Communication will also take into account a broad public consultation and survey. Critical questions will be whether the Commission embraces the proposed self-regulatory model outlined in the report, and the extent to which the tech companies follow through on their commitment to greater data-sharing. At the very least, the creation of the Group and the impressive commitment shown in generating the report on such a tight turnaround is yet another sign of how seriously the European community is taking the threat of digital disinformation.