Digital platforms have fundamentally changed the way we communicate, express and inform ourselves. This requires new rules to safeguard democratic values. As the Digital Services Act (DSA) awaits adoption by the EU, Natali Helberger, Alexandra Borchardt and Cristian Vaccari explain here how the Council of Europe’s recently adopted recommendation “on the impact of digital technologies on freedom of expression” can complement the implementation of the DSA, which aims to update rules governing digital services in the EU. All three were members of the Council’s expert committee that was set up for this purpose, working in 2020 and 2021.
When Elon Musk announced his original plan to buy Twitter and, in his words, restore freedom of speech on the platform, EC Commissioner Thierry Breton quickly reminded him of the Digital Services Act (DSA). According to the DSA, providers of what it defines as ‘Very Large Online Platforms’ will have to ‘pay due regard to freedom of expression and information, including media freedom and pluralism.’ They will have to monitor their recommendation and content moderation algorithms for any systemic risks to the fundamental rights and values that constitute Europe. A video of Musk and Breton in Austin, Texas, shows Musk eagerly nodding and assuring Breton that “this all is very well aligned with what we are planning.”
But what exactly is well aligned here? What does it mean for social media platforms, such as Twitter, to pay due regard to freedom of expression, media freedom and pluralism? While the DSA enshrines a firm commitment to freedom of expression, it provides only limited concrete guidance on what freedom of expression means in a platform context. So while Musk's eager nodding may have been sincere, there is also a realistic chance that he had no concrete idea of what exactly he was agreeing to.
The Council of Europe’s recently adopted recommendation “on the impact of digital technologies on freedom of expression” provides some much-needed guidance.
The leading fundamental rights organisation in Europe
The Council of Europe is the largest international fundamental rights organisation in Europe. Distinct from the European Union, the Council brings together the EU member states and 20 more European states to develop joint visions on European values and fundamental freedoms, as enshrined in the European Convention on Human Rights and interpreted by the European Court of Human Rights. Article 10 of the ECHR defines freedom of expression as “the freedom to hold opinions and to receive and impart information and ideas without interference by public authority and regardless of frontiers.”
European media laws and policies have been significantly shaped by the conventions, recommendations and guidelines of the Council. One of the Council’s most recent expert committees was tasked with preparing a recommendation on the impact of digital technologies on freedom of expression, as well as guidelines on best practices for content moderation by internet intermediaries. The guidelines are already described here and here. In this post, the rapporteurs and chair of the committee briefly summarise the key takeaways from the recommendation (for a full list of experts involved in the making of the recommendation, please see here). In so doing, we will explain the guidelines and address the question of how they complement and add to the recently agreed DSA.
A value-based approach
The recommendation lays down principles to ensure that “digital technologies serve rather than curtail” freedom of expression and develops proposals to address the adverse impacts and enhance the positive effects of digital technology on freedom of expression. Here we note a first difference with the DSA. The DSA takes a risk-based approach: for example, Art. 26 requires Very Large Online Platforms to identify the risks and dangers that their recommendation and content moderation algorithms pose to fundamental rights and society. As such, it focuses on the negative implications of technology.
In contrast, the Council of Europe recommendation takes a value-based approach. It first clarifies that these technologies have an essential, positive role in a democracy by opening up the public sphere to more and diverse voices. According to the Council, the “digital infrastructures of communication in a democratic society” need to be designed “to promote human rights, openness, interoperability, transparency, and fair competition”. This value-based approach to digital technology acknowledges the need to mitigate risks, but goes one step further and demands that states, companies, and civil society actors work together to realise technology’s positive contribution to democracy and fundamental rights. It is vital to note this difference, as both a risk-based approach and a value- and opportunity-based approach will set the agenda for research and innovation.
Digital infrastructure design and the creation of counter-power
Where the DSA takes an application or tool-based approach, the recommendation adopts a broader media ecology perspective. The DSA addresses algorithmic content moderation, news recommenders and curation first and foremost as related to specific digital tools and applications. The recommendation takes a different approach and acknowledges that all those digital tools and applications together form the wider digital communication infrastructure that democracies rely on. According to the recommendation, these digital communication infrastructures should be designed to proactively promote human rights, openness, accessibility, interoperability, transparency and fair competition.
One key recommendation that arises from this media ecology view of digital technology is for states to proactively invest in and create the conditions to enhance economic competition and democratic pluralism in and on digital infrastructures. Other key recommendations include stimulating the digital transformation of news organisations, promoting open-source software, and investing in public service media. The recommendation also explicitly stresses the essential democratic role of local and regional media and the need to tackle concentration in terms of both economic dominance and, crucially, the power to shape public opinion. The recently adopted Council of Europe recommendation on creating a favourable environment for quality journalism complements the document and provides more detail in this particular area.
Transparency, accountability and redress as a joint responsibility of states and internet intermediaries
Transparency and explainability are essential in both the recommendation and the DSA. Like the DSA, the recommendation requires internet intermediaries to provide adequate transparency on the design and implementation of their terms of service and their key policies for content moderation, such as information regarding removal, recommendation, amplification, promotion, downranking, monetisation, and distribution, particularly concerning their outcomes for freedom of expression. The recommendation highlights that such information must ensure transparency on different levels and with different goals, including empowering users, enabling third-party auditing and oversight, and informing independent efforts to counter harmful content online. In other words, transparency is a multi-faceted and multi-player concept.
Having said that, whereas the DSA places the burden of providing transparency in the first place on platforms, the Council of Europe’s recommendation also ascribes responsibility to states and regulators. It advocates that states and regulators “should ensure that all necessary data are generated and published to enable any analysis necessary to guarantee meaningful transparency on how internet intermediaries’ policies and their implementation affect freedom of expression among the general public and vulnerable subjects.” States should also “assist private actors and civil society organisations in the development of independent institutional mechanisms that ensure impartial and comprehensive verification of the completeness and accuracy of data made available by internet intermediaries.” This approach complements the DSA in at least two respects: it assigns states a responsibility to ensure the accessibility and usability of such information, and it supports the development of independent systems of quality control (rather than relying exclusively on the mechanisms of Art. 31 DSA).
The extensive transparency mechanisms must be seen in the context of the recommendations on contestability. Transparency can be a value in itself, but as a regulatory tool, transparency obligations are primarily intended to empower subjects to take action. Consequently, the recommendation includes an obligation for states to ensure that any person whose freedom of expression is limited due to restrictions imposed by internet intermediaries must be able to seek timely and effective redress. Interestingly, the recommendation also extends this right to the news media: news providers whose editorial freedom is threatened due to terms of service or content moderation policies must have access to timely and effective redress mechanisms, too.
Actionable and empowering media literacy
The Council of Europe has a long tradition of supporting and developing media literacy policies, and this recommendation is no exception. The recommendation promotes data and digital literacy to help users understand the conditions under which digital technologies affect freedom of expression, how information of varying quality is produced, distributed and processed and, importantly, what individuals can do to protect their rights. As in other domains, the recommendation stresses the positive role that states can play. States should enable users to engage in informational self-determination and exercise greater control over the data they generate, the inferences derived from such data, and the content they can access. Although it is undeniable that the complexity of digital information environments places a higher burden on citizens to select, filter, and evaluate the content they encounter, the recommendation aims to promote processes and practices that reduce this burden by enhancing user empowerment and control.
Independent research for evidence-based rulemaking
In current regulatory proposals, there is a growing recognition of the role that independent research must play. Among other things, research can help to:
- identify (systemic) risks to fundamental rights, society and democracy as a result of the use of algorithmic tools,
- monitor compliance with the rules and responsibilities that pertain to those using those tools,
- develop insights on how to design technologies, institutions and governance frameworks to promote and realise fundamental rights and public values.
There is also growing recognition of the responsibility of states and platforms to create the conditions for independent researchers to be able to play such an important role. The provisions in Art. 31 of the DSA on access to research data are an example of this new awareness.
The CoE recommendation, too, requires that internet intermediaries enable researchers to access the kinds of high-quality data that are necessary to investigate the individual and societal impacts of digital technologies on fundamental rights. The recommendation goes one step further than the DSA, however, and also emphasises the broader conditions that need to be fulfilled for independent researchers to play such a role. Besides calling for states to provide adequate funding for such research, the recommendation stresses the need to create secure environments that facilitate data access and analysis, as well as measures to protect the independence of researchers.
It is worth noting that the recommendation also suggests a new, more general research exception: data lawfully collected for other purposes by internet intermediaries may be processed to conduct rigorous and independent research, on the condition that such research is carried out with the goal of safeguarding a substantial public interest in understanding and governing the implications of digital technologies for human rights. Such a research exception goes beyond the scope of Art. 31 DSA and addresses the problem that data access could be restricted because the internet intermediaries’ terms of use and privacy policies that users agree to often fail to include explicit derogations for re-use of the data for research.
Conclusions
In sum, the Council of Europe’s recommendation offers a new vision of what it means to safeguard and at the same time expand freedom of expression in the digital age. There is a fine line between regulating speech and making sure that everyone gets a voice. The recommendation offers several actionable suggestions concerning the design of digital communication infrastructures, transparency and accountability, user awareness and empowerment, and support for the societal role of independent research. As such, the guidelines can be an essential resource for policymakers, civil society, academics, and internet intermediaries such as Google, Meta, Twitter or TikTok.
The latter companies are confronted with a challenging problem: prominent and ambitious regulatory proposals such as the DSA will require internet intermediaries to understand and account for the human rights implications of their technologies, even though they are not the classical addressees of human rights law. Fundamental rights, such as the right to freedom of expression, at least in Europe, apply in the first place to the relationship between states and citizens. Mandating that private actors such as internet intermediaries pay due regard to abstract rights such as the right to freedom of expression raises a host of difficult interpretational questions. More generally, the current European Commission’s focus on requiring the application of digital technology in line with fundamental rights and European values is laudable. Still, there is only limited expertise on how to interpret and implement fundamental rights law in the European Union, which started as, and still is primarily, an economic community. The Council of Europe’s recommendations and guidelines have an important complementary role to play in clarifying what respect for fundamental rights entails in the digital age and suggesting concrete actions to realise this vision.
This article, first published on 14th September 2022, reflects the views of the authors and not those of the Media@LSE blog nor of the London School of Economics and Political Science.
OXFORD – Depending on where you get your news, your view of how the impeachment inquiry into US President Donald Trump is unfolding may be very different from that of your friends, relatives, or neighbors. You may also think that any version of the story that conflicts with yours is simply untrue. This lack of consensus on basic facts – largely a byproduct of social media – carries serious risks, and not nearly enough is being done to address it.
In recent years, the need to improve “media literacy” has become a favorite exhortation of those seeking to combat misinformation in the digital age, especially those who would prefer to do so without tightening regulation of tech giants like Facebook and Google. If people had enough media savvy, the logic goes, they would be able to separate the wheat from the chaff, and quality journalism would prevail.
There is some truth to this. Just as it is dangerous to drive in a place where you don’t know the traffic laws, navigating the new digital-media environment safely – avoiding not only “fake news,” but also threats like online harassment, nonconsensual (“revenge”) porn, and hate speech – requires knowledge and awareness. Robust efforts to improve media literacy globally are thus crucial. Free, credible, and independent news media are a pillar of any functioning democracy, essential to enable voters to make informed decisions and to hold elected leaders accountable. Given this, media literacy must be pursued within a broader campaign to improve democratic literacy.
Since its invention in ancient Greece more than 2,500 years ago, democracy has depended on rules and institutions that strike a balance between participation and power. If the point was simply to enable everyone to speak up, platforms like Facebook and Twitter would be the pinnacle of democracy, and popular movements like the 2011 Arab Spring would naturally produce functioning governments.
Instead, the objective is to create a system of governance in which elected leaders bring to bear their knowledge and experience, in order to advance the interests of the people. The rule of law and the separation of powers, guaranteed by a system of checks and balances, are vital to the functioning of such a system. In short, mobilization means little without institutionalization.
And yet, today, public institutions are suffering from the same lack of trust as news media. To some extent, this is warranted: many governments have failed to meet their citizens’ needs, and corruption is rampant. This has fueled rising skepticism toward democratic institutions, with people often preferring ostensibly more egalitarian online platforms, where everyone’s voice can be heard.
The problem is that such platforms lack the checks and balances that informed decision-making demands. And, contrary to the early expectations of some Internet pioneers, those checks and balances will not emerge organically. On the contrary, tech companies’ algorithm-driven business models all but preclude them, because they amplify voices according to clicks and likes, not value or veracity.
Populist politicians have taken advantage of the lack of checks and balances to obtain power, which they often use to please their supporters, ignoring the needs of opponents or minority groups. This type of majority rule looks a lot like mob rule, with populist leaders trying to overrule legislatures and courts to fulfill the desires – often shaped by lies and propaganda – of their constituents. British Prime Minister Boris Johnson’s recent attempt to suspend Parliament, in order to minimize its ability to prevent a no-deal Brexit, is a case in point.
In a democracy, all people must be able to trust their leaders to uphold their rights and protect their basic interests, regardless of whom they voted for. They should be able to go about their daily lives, confident that public officials will dedicate their time and energy to making informed decisions – and that those who don’t will be checked and balanced by the rest. Credible independent media support this process.
In Johnson’s case, the judiciary fulfilled its duty to check the executive. But with every assault on democratic institutions, accountability is weakened, people become more disillusioned, and the legitimacy of the system declines. Over time, this reduces the incentive for talented people to work in fields like journalism and politics, eroding their effectiveness and legitimacy further.
Breaking this vicious circle requires the rapid expansion of media and democratic literacy, including how the system works and who owns and shapes it. And yet, as a forthcoming study by the Council of Europe’s Expert Committee on Quality Journalism in the Digital Age (on which I served) shows, most existing media-literacy programs are limited to teaching schoolchildren how to use digital platforms and understand news content. Very few target older people (who are most in need), explain who controls media and digital infrastructure, or teach the mechanisms of algorithmic choice.
Democracies all over the world are enduring a stress test. If they are to pass, their institutional underpinnings must be reinforced. That requires, first and foremost, an understanding of what those underpinnings are, why they matter, and who is trying to dismantle them.
This commentary was published by Project Syndicate on November 28, 2019.