Digital platforms have fundamentally changed the way we communicate, express and inform ourselves. This requires new rules to safeguard democratic values. As the Digital Services Act (DSA) awaits adoption by the EU, Natali Helberger, Alexandra Borchardt and Cristian Vaccari explain here how the Council of Europe’s recently adopted recommendation “on the impact of digital technologies on freedom of expression” can complement the implementation of the DSA, which aims to update rules governing digital services in the EU. All three were members of the Council’s expert committee that was set up for this purpose, working in 2020 and 2021.
When Elon Musk announced his original plan to buy Twitter and, in his words, restore freedom of speech on the platform, EC Commissioner Thierry Breton quickly reminded him of the Digital Services Act (DSA). According to the DSA, providers of what it defines as ‘Very Large Online Platforms’ will have to ‘pay due regard to freedom of expression and information, including media freedom and pluralism.’ They will have to monitor their recommendation and content moderation algorithms for any systemic risks to the fundamental rights and values that constitute Europe. A video of Musk and Breton in Austin, Texas, shows Musk eagerly nodding and assuring Breton that “this all is very well aligned with what we are planning.”
But what exactly is well aligned here? What does it mean for social media platforms, such as Twitter, to pay due regard to freedom of expression, media freedom and pluralism? While the DSA enshrines a firm commitment to freedom of expression, it provides only limited concrete guidance on what freedom of expression means in a platform context. So when Musk nodded along like an eager schoolboy, his intentions may have been sincere, but there is also a realistic chance that he had no concrete idea of what exactly he was agreeing to.
The Council of Europe’s recently adopted recommendation “on the impact of digital technologies on freedom of expression” provides some much-needed guidance.
The leading fundamental rights organisation in Europe
The Council of Europe is the largest international fundamental rights organisation in Europe. Distinct from the European Union, the Council brings together the EU member states and 20 more European states to develop joint visions on European values and fundamental freedoms, as enshrined in the European Convention on Human Rights (ECHR) and interpreted by the European Court of Human Rights. Article 10 of the ECHR defines freedom of expression as “the freedom to hold opinions and to receive and impart information and ideas without interference by public authority and regardless of frontiers.”
European media laws and policies have been significantly shaped by the conventions, recommendations and guidelines of the Council. One of the most recent expert committees of the Council was tasked with preparing a recommendation on the impacts of digital technologies on freedom of expression, as well as guidelines on best practices for content moderation by internet intermediaries. The guidelines are already described here and here. In this post, the rapporteurs and chair of the Committee briefly summarise the key takeaways from the recommendation (for a full list of experts involved in the making of the recommendation, please see here). In so doing, we will explain the guidelines and address the question of how they complement and add to the recently agreed DSA.
A value-based approach
The recommendation lays down principles to ensure that “digital technologies serve rather than curtail” freedom of expression and develops proposals to address the adverse impacts and enhance the positive effects of digital technology on freedom of expression. Here we note a first difference with the DSA. The DSA takes a risk-based approach: for example, Art. 26 requires Very Large Online Platforms to identify the risks and dangers that their recommendation and content moderation algorithms pose for fundamental rights and society. As such it focuses on the negative implications of technology.
In contrast, the Council of Europe recommendation takes a value-based approach. It first clarifies that these technologies have an essential, positive role in a democracy by opening up the public sphere to more and diverse voices. According to the Council, the “digital infrastructures of communication in a democratic society” need to be designed “to promote human rights, openness, interoperability, transparency, and fair competition”. This value-based approach to digital technology acknowledges the need to mitigate risks, but goes one step further and demands that states, companies, and civil society actors work together to realise technology’s positive contribution to democracy and fundamental rights. It is vital to note this difference, as both a risk-based and a value- and opportunity-based approach will set the agenda for research and innovation.
Digital infrastructure design and the creation of counter-power
Where the DSA takes an application or tool-based approach, the recommendation adopts a broader media ecology perspective. The DSA addresses algorithmic content moderation, news recommenders and curation first and foremost as related to specific digital tools and applications. The recommendation takes a different approach and acknowledges that all those digital tools and applications together form the wider digital communication infrastructure that democracies rely on. According to the recommendation, these digital communication infrastructures should be designed to proactively promote human rights, openness, accessibility, interoperability, transparency and fair competition.
One key recommendation that arises from this media ecology view of digital technology is for states to proactively invest in and create the conditions to enhance economic competition and democratic pluralism in and on digital infrastructures. Other key recommendations include stimulating the digital transformation of news organisations, promoting open-source software, and investing in public service media. The recommendation also explicitly stresses the essential democratic role of local and regional media and the need to tackle concentration in terms of both economic dominance and, crucially, the power to shape public opinion. The recently adopted Council of Europe recommendation on creating a favourable environment for quality journalism complements the document and provides more detail in this particular area.
Transparency, accountability and redress as a joint responsibility of states and internet intermediaries
Transparency and explainability are essential in both the recommendation and the DSA. Like the DSA, the recommendation requires internet intermediaries to provide adequate transparency on the design and implementation of their terms of service and their key policies for content moderation, such as information regarding removal, recommendation, amplification, promotion, downranking, monetisation, and distribution, particularly concerning their outcomes for freedom of expression. The recommendation highlights that such information must ensure transparency on different levels and with different goals, including empowering users, enabling third-party auditing and oversight, and informing independent efforts to counter harmful content online. In other words, transparency is a multi-faceted and multi-player concept.
Having said that, whereas the DSA places the burden of providing transparency in the first place on platforms, the Council of Europe’s recommendation also ascribes responsibility to states and regulators. It advocates that states and regulators “should ensure that all necessary data are generated and published to enable any analysis necessary to guarantee meaningful transparency on how internet intermediaries’ policies and their implementation affect freedom of expression among the general public and vulnerable subjects.” States should also “assist private actors and civil society organisations in the development of independent institutional mechanisms that ensure impartial and comprehensive verification of the completeness and accuracy of data made available by internet intermediaries.” This approach complements the DSA in at least two respects: it assigns states a responsibility to ensure the accessibility and usability of such information, and it supports the development of independent systems of quality control (rather than relying exclusively on the mechanisms of Art. 31 DSA).
The extensive transparency mechanisms must be seen in the context of the recommendations on contestability. Transparency can be a value in itself, but as a regulatory tool, transparency obligations are primarily intended to empower subjects to take action. Consequently, the recommendation includes an obligation for states to ensure that any person whose freedom of expression is limited due to restrictions imposed by internet intermediaries must be able to seek timely and effective redress. Interestingly, the recommendation also extends this right to the news media: news providers whose editorial freedom is threatened due to terms of service or content moderation policies must also have access to timely and effective redress mechanisms.
Actionable and empowering media literacy
The Council of Europe has a long tradition of supporting and developing media literacy policies, and this recommendation is no exception. The recommendation promotes data and digital literacy to help users understand the conditions under which digital technologies affect freedom of expression, how information of varying quality is procured, distributed and processed and, importantly, what individuals can do to protect their rights. As in other domains, the recommendation stresses the positive role that states can play. States should enable users to engage in informational self-determination and exercise greater control over the data they generate, the inferences derived from such data, and the content they can access. Although it is undeniable that the complexity of digital information environments places a higher burden on citizens to select, filter, and evaluate the content they encounter, the recommendation aims to promote processes and practices that reduce this burden by enhancing user empowerment and control.
Independent research for evidence-based rulemaking
In current regulatory proposals, there is a growing recognition of the role that independent research must play. Among other things, research can help to:
- identify (systemic) risks to fundamental rights, society and democracy as a result of the use of algorithmic tools,
- monitor compliance with the rules and responsibilities that pertain to those using those tools,
- develop insights on how to design technologies, institutions and governance frameworks to promote and realise fundamental rights and public values.
There is also growing recognition of the responsibility of states and platforms to create the conditions for independent researchers to be able to play such an important role. The provisions in Art. 31 of the DSA on access to research data are an example of this new awareness.
The CoE recommendation, too, requires that internet intermediaries enable researchers to access the kinds of high-quality data that are necessary to investigate the individual and societal impacts of digital technologies on fundamental rights. The recommendation goes one step further than the DSA, however, and also emphasises the broader conditions that need to be fulfilled for independent researchers to play such a role. Besides calling for states to provide adequate funding for such research, the recommendation stresses the need to create secure environments that facilitate data access and analysis, as well as measures to protect the independence of researchers.
In sum, the Council of Europe’s recommendation offers a new vision of what it means to safeguard and at the same time expand freedom of expression in the digital age. There is a fine line between regulating speech and making sure that everyone gets a voice. The recommendation offers several actionable suggestions concerning the design of digital communication infrastructures, transparency and accountability, user awareness and empowerment, and support for the societal role of independent research. As such, the guidelines can be an essential resource for policymakers, civil society, academics, and internet intermediaries such as Google, Meta, Twitter or TikTok.
The latter companies are confronted with a challenging problem: prominent and ambitious regulatory proposals such as the DSA will require internet intermediaries to understand and account for the human rights implications of their technologies, even though they are not the classical addressees of human rights law. Fundamental rights, such as the right to freedom of expression, at least in Europe, apply in the first place to the relationship between states and citizens. Mandating that private actors such as internet intermediaries pay due regard to abstract rights such as the right to freedom of expression raises a host of difficult interpretational questions. More generally, the current European Commission’s focus on requiring the application of digital technology in line with fundamental rights and European values is laudable. Still, there is only limited expertise on how to interpret and implement fundamental rights law in the European Union, which started as, and still is primarily, an economic community. The Council of Europe’s recommendations and guidelines have an important complementary role to play in clarifying what respect for fundamental rights entails in the digital age and suggesting concrete actions to realise this vision.
This article, first published on 14th September 2022, reflects the views of the authors and not those of the Media@LSE blog nor of the London School of Economics and Political Science.