
Content: what are the responsibilities of social platforms and networks?*

28/11/2017

Social platforms and networks have been in the firing line over the past few months. The finger has been pointed at them for their responsibility in spreading rumours, “hoaxes” and other kinds of “fake news” that punctuated the American and French election campaigns. They have also been deemed guilty of allowing “illicit” content to be posted; an American lawyer filing a suit for complicity in advocating terrorism has even accused them of enabling the emergence of Daesh. The accusations and powers rightly or wrongly attributed to such platforms credit them with an immense level of influence, going far beyond their actual scope, judging by the figures provided to the US Congress on the estimated level of pro-Russian propaganda circulating during the last presidential campaign: 300,000 views on YouTube, 36,000 Twitter accounts automatically generating content, and 126 million Americans potentially exposed via Facebook.

Is it really right, however, to see these platforms as the source of all evil? Let’s remember that their flagship business model is based on advertising: they are thus highly specialised in optimising, contextualising and editing content. It is also important to recall that the “liberating power” of the Internet was widely used and welcomed by the US authorities, especially at the time of the Arab Spring. By acting as fantastic sounding boards, these platforms do indeed contribute to the manufacturing of public opinion, to use the words of Noam Chomsky. Although their responsibility goes beyond that of simply passing on information, they should not, however, be seen as the new justices of the peace, tasked with determining what is good or bad for democracy, dangerous or harmless to our security.

What needs to be done, on the one hand, is therefore to improve both the education of Internet users and the mechanisms that make it possible to detect manipulations of information (an estimated 48 million Twitter accounts are allegedly managed by bots…).
On the other hand, there is an urgent need to strengthen the fight against “illegal content”, including terrorist propaganda, drawing upon current legislation, especially the French law of 13 November 2014, which sets out blocking, withdrawal and dereferencing procedures. Although human beings will inevitably need to remain in the loop, platforms are currently identifying new opportunities to draw upon artificial intelligence. From March 2016 to April 2017, 5,512 requests (60% of which were related to terrorism) were sent to such platforms via the administrative procedure, and Twitter has removed 630,000 terrorist accounts…

Florian Bachelier

Member of Parliament, 8th constituency of Ille-et-Vilaine

First Quaestor of the French National Assembly

*This will be the theme of the next FIC Agora, to be held at the Maison de la Chimie on 14 December 2017 from 8.30 AM to midday. The event will be organised under the distinguished patronage of the Committee on Digital and Posts.

 The FIC Agora aims to regularly bring together elected officials and stakeholders in the digital world for mini-sessions on cybersecurity issues. It complements the monthly FIC Observatory breakfasts.