
Disinformation and ‘Fake News’: Final Report: UK Parliamentary Committee Releases a Report on Disinformation


Digital, Culture, Media and Sport Committee, Disinformation and ‘fake news’: Final Report, 2019-8, HC 1791 (UK).

On February 18, 2019, the UK Digital, Culture, Media and Sport Committee published Disinformation and ‘fake news’: Final Report (“Report”). The Report denounces Facebook and other big tech companies as “‘digital gangsters’ in the online world” that are “ahead of and beyond the law,” and calls for greater constraints on, and proper regulatory oversight of, these companies.

Building on the Interim Report published in July 2018, the Report addresses the main issues in seven areas, reiterates several of the Interim Report’s recommendations, and asks the Government to reconsider recommendations to which it did not respond earlier and to set out concrete plans in its forthcoming Online Harms White Paper.

First, the Report recommends a new role, definition, and set of legal liabilities for tech companies. Under this new approach, tech companies could no longer hide behind their self-proclaimed status as mere “platforms”; instead, they would assume responsibility for regulating the content on their sites. The Report also recommends creating an online regulatory system in which tech companies must follow a Code of Ethics to “act against agreed harmful and illegal content on their platform.” An independent regulator would have the power to launch legal proceedings against companies that fail to meet their obligations.

Second, the Report criticizes Facebook’s data misuse and data targeting, citing evidence obtained from the Six4Three court documents. It claims that the Cambridge Analytica scandal would not have occurred but for Facebook’s data policies. It states that Facebook “was willing to override its users’ privacy settings in order to transfer data to some app developers” and “to starve some developers . . . of that data, thereby causing them to lose their business.” The Report calls for an investigation into Facebook’s involvement in any anti-competitive practices and into “whether Facebook is unfairly using its dominant market position in social media to decide which businesses should succeed or fail.”

Third, the Report suggests that the relationship between Cambridge Analytica, SCL Group, and Aggregate IQ (“AIQ”) was closer than a merely contractual one. It calls out AIQ’s practice of “work[ing] on both the US Presidential primaries and for Brexit-related organisations” to “target people . . . in order to influence their decisions.”

Fourth, in response to the shift from physical to online advertising and political campaigning, the Report argues that current electoral law “is not fit for purpose and needs to be changed to reflect changes in campaigning techniques.” Calling attention to organizations such as Mainstream Network that seek to sway political debate in their favor on Facebook and other social media, the Report urges the Government to explore ways for the Electoral Commission to carry out more comprehensive work, such as compelling organizations to provide information relevant to its inquiries.

Fifth, the Report asks the Government to address foreign influence in political campaigns, particularly Russian interference in UK politics. It advocates a review of the current rules on overseas involvement in UK elections and a stronger Government stance toward social media companies that publicize disinformation.

Sixth, the Report highlights SCL’s influence in foreign elections. It points out that SCL Group and its associated companies have been involved in election and referendum campaigns in more than twenty countries. It thus urges the Government to “consider new regulations that curb bad behavior in this industry” and to “revisit the UK Bribery Act, to gauge whether the legislation is enough of a regulatory brake on bad behaviour abroad.”

Last, the Report stresses the importance of digital literacy as “a fourth pillar of education.” It argues that digital literacy enables the public to report misleading and unlawful digital campaigning, and to distinguish between quality journalism and disinformation.

In response to the Report, Facebook said that it “share[s] the committee’s concerns about false news and election integrity,” “[is] open to meaningful regulation,” and “support[s] the committee’s recommendation for electoral law reform.” In December 2018, Facebook also defended itself by arguing that the Six4Three documents had been “selectively leaked” to tell “only one side of the story.”