Free speech in the digital age – a constructive approach

Digital platforms have fundamentally changed the way we communicate, express ourselves and stay informed. This requires new rules to safeguard democratic values. As the Digital Services Act (DSA) awaits adoption by the EU, Natali Helberger, Alexandra Borchardt and Cristian Vaccari explain here how the Council of Europe’s recently adopted recommendation “on the impact of digital technologies on freedom of expression” can complement the implementation of the DSA, which aims to update rules governing digital services in the EU. All three were members of the Council’s expert committee that was set up for this purpose, working in 2020 and 2021.

When Elon Musk announced his original plan to buy Twitter and, in his words, restore freedom of speech on the platform, European Commissioner Thierry Breton quickly reminded him of the Digital Services Act (DSA). According to the DSA, providers of what it defines as ‘Very Large Online Platforms’ will have to ‘pay due regard to freedom of expression and information, including media freedom and pluralism.’ They will have to monitor their recommendation and content moderation algorithms for any systemic risks to the fundamental rights and values that constitute Europe. A video of Musk and Breton in Austin, Texas, shows Musk eagerly nodding and assuring Breton that “this all is very well aligned with what we are planning.”

But what exactly is well aligned here? What does it mean for social media platforms, such as Twitter, to pay due regard to freedom of expression, media freedom and pluralism? While the DSA enshrines a firm commitment to freedom of expression, it provides only limited concrete guidance on what freedom of expression means in a platform context. So when Musk nodded along like an eager schoolboy, his intentions may well have been sincere, but there is also a realistic chance that he had no concrete idea of what exactly he was agreeing to.

The Council of Europe’s recently adopted recommendation “on the impact of digital technologies on freedom of expression” provides some much-needed guidance.

The leading fundamental rights organisation in Europe

The Council of Europe is the largest international fundamental rights organisation in Europe. Distinct from the European Union, the Council brings together the EU member states and some 20 more European states to develop joint visions on European values and fundamental freedoms, as enshrined in the European Convention on Human Rights and interpreted by the European Court of Human Rights. Article 10 of the ECHR defines freedom of expression as “the freedom to hold opinions and to receive and impart information and ideas without interference by public authority and regardless of frontiers.”

European media laws and policies have been significantly shaped by the conventions, recommendations and guidelines of the Council. One of its most recent expert committees was tasked with preparing a recommendation on the impacts of digital technologies on freedom of expression, as well as guidelines on best practices for content moderation by internet intermediaries. The guidelines are already described here and here. In this post, the rapporteurs and chair of the committee briefly summarise the key takeaways from the recommendation (for a full list of experts involved in the making of the recommendation, please see here). In so doing, we address the question of how the recommendation complements and adds to the recently agreed DSA.

A value-based approach

The recommendation lays down principles to ensure that “digital technologies serve rather than curtail” freedom of expression and develops proposals to address the adverse impacts and enhance the positive effects of digital technology on freedom of expression. Here we note a first difference with the DSA. The DSA takes a risk-based approach: for example, Art. 26 requires Very Large Online Platforms to identify the risks and dangers that their recommendation and content moderation algorithms pose for fundamental rights and society. As such it focuses on the negative implications of technology.

In contrast, the Council of Europe recommendation takes a value-based approach. It first clarifies that these technologies have an essential, positive role in a democracy by opening up the public sphere to more and diverse voices. According to the Council, the “digital infrastructures of communication in a democratic society” need to be designed “to promote human rights, openness, interoperability, transparency, and fair competition”. This value-based approach to digital technology acknowledges the need to mitigate risks, but goes one step further and demands that states, companies, and civil society actors work together to realize technology’s positive contribution to democracy and fundamental rights. It is vital to note this difference, as both a risk-based and a value- and opportunity-based approach will set the agenda for research and innovation.

Digital infrastructure design and the creation of counter-power

Where the DSA takes an application or tool-based approach, the recommendation adopts a broader media ecology perspective. The DSA addresses algorithmic content moderation, news recommenders and curation first and foremost as related to specific digital tools and applications. The recommendation takes a different approach and acknowledges that all those digital tools and applications together form the wider digital communication infrastructure that democracies rely on. According to the recommendation, these digital communication infrastructures should be designed to proactively promote human rights, openness, accessibility, interoperability, transparency and fair competition.

One key recommendation that arises from this media ecology view of digital technology is for states to proactively invest in and create the conditions to enhance economic competition and democratic pluralism in and on digital infrastructures. Other key recommendations include stimulating the digital transformation of news organisations, promoting open-source software, and investing in public service media. The recommendation also explicitly stresses the essential democratic role of local and regional media and the need to tackle concentration in terms of both economic dominance and, crucially, the power to shape public opinion. The recently adopted Council of Europe recommendation on creating a favourable environment for quality journalism complements the document and provides more detail in this particular area.

Transparency, accountability and redress as a joint responsibility of states and internet intermediaries

Transparency and explainability are essential in both the recommendation and the DSA. Like the DSA, the recommendation requires internet intermediaries to provide adequate transparency on the design and implementation of their terms of service and their key policies for content moderation, such as information regarding removal, recommendation, amplification, promotion, downranking, monetisation, and distribution, particularly concerning their outcomes for freedom of expression. The recommendation highlights that such information must ensure transparency on different levels and with different goals, including empowering users, enabling third-party auditing and oversight, and informing independent efforts to counter harmful content online. In other words, transparency is a multi-faceted and multi-player concept.

Having said that, whereas the DSA places the burden of providing transparency in the first place on platforms, the Council of Europe’s recommendation also ascribes responsibility to states and regulators. It advocates that states and regulators “should ensure that all necessary data are generated and published to enable any analysis necessary to guarantee meaningful transparency on how internet intermediaries’ policies and their implementation affect freedom of expression among the general public and vulnerable subjects.” States should also “assist private actors and civil society organisations in the development of independent institutional mechanisms that ensure impartial and comprehensive verification of the completeness and accuracy of data made available by internet intermediaries.” This approach complements the DSA in at least two respects: it assigns states a responsibility to ensure the accessibility and usability of such information, and it supports the development of independent systems of quality control (rather than relying exclusively on the mechanisms of Art. 31 DSA).

The extensive transparency mechanisms must be seen in the context of the recommendations on contestability. Transparency can be a value in itself, but as a regulatory tool, transparency obligations are primarily intended to empower subjects to take action. Consequently, the recommendation includes an obligation for states to ensure that any person whose freedom of expression is limited due to restrictions imposed by internet intermediaries must be able to seek timely and effective redress. Interestingly, the recommendation also extends this right to the news media: news providers whose editorial freedom is threatened due to terms of service or content moderation policies must be able to seek timely and effective redress mechanisms, too.

Actionable and empowering media literacy

The Council of Europe has a long tradition of supporting and developing media literacy policies, and this recommendation is no exception. The recommendation promotes data and digital literacy to help users understand the conditions under which digital technologies affect freedom of expression, how information of varying quality is procured, distributed and processed and, importantly, what individuals can do to protect their rights. As in other domains, the recommendation stresses the positive role that states can play. States should enable users to engage in informational self-determination and exercise greater control over the data they generate, the inferences derived from such data, and the content they can access. Although it is undeniable that the complexity of digital information environments places a higher burden on citizens to select, filter, and evaluate the content they encounter, the recommendation aims to promote processes and practices that reduce this burden by enhancing user empowerment and control.

Independent research for evidence-based rulemaking

In current regulatory proposals, there is a growing recognition of the role that independent research must play. Among other things, research can help to:

  • identify (systemic) risks to fundamental rights, society and democracy as a result of the use of algorithmic tools,
  • monitor compliance with the rules and responsibilities that pertain to those using those tools,
  • develop insights on how to design technologies, institutions and governance frameworks to promote and realise fundamental rights and public values.

There is also growing recognition of the responsibility of states and platforms to create the conditions for independent researchers to be able to play such an important role. The provisions in Art. 31 of the DSA on access to research data are an example of this new awareness.

The CoE recommendation, too, emphasises and requires that internet intermediaries must enable researchers to access the kinds of high-quality data that are necessary to investigate the individual and societal impacts of digital technologies on fundamental rights. The recommendation goes one step further than the DSA, however, and also emphasises the broader conditions that need to be fulfilled for independent researchers to play such a role. Besides calling for states to provide adequate funding for such research, the recommendation stresses the need to create secure environments that facilitate secure data access and analysis, as well as measures to protect the independence of researchers.

It is worth noting that the recommendation also suggests a new, more general research exception: data lawfully collected for other purposes by internet intermediaries may be processed for rigorous and independent research, on the condition that such research serves a substantial public interest in understanding and governing the implications of digital technologies for human rights. Such a research exception goes beyond the scope of Art. 31 DSA and addresses the problem that data access can be restricted because the terms of use and privacy policies users agree to often fail to include explicit derogations for re-use of the data for research.

Conclusions

In sum, the Council of Europe’s recommendation offers a new vision of what it means to safeguard and at the same time expand freedom of expression in the digital age. There is a fine line between regulating speech and making sure that everyone gets a voice. The recommendation offers several actionable suggestions concerning the design of digital communication infrastructures, transparency and accountability, user awareness and empowerment, and support for the societal role of independent research. As such, the guidelines can be an essential resource for policymakers, civil society, academics, and internet intermediaries such as Google, Meta, Twitter or TikTok.

The latter companies are confronted with a challenging problem: prominent and ambitious regulatory proposals such as the DSA will require internet intermediaries to understand and account for the human rights implications of their technologies, even though they are not the classical addressees of human rights law. Fundamental rights, such as the right to freedom of expression, at least in Europe, apply in the first place to the relationship between states and citizens. Mandating that private actors such as internet intermediaries pay due regard to abstract rights such as the right to freedom of expression raises a host of difficult interpretational questions. More generally, the current European Commission’s focus on requiring the application of digital technology in line with fundamental rights and European values is laudable. Still, there is only limited expertise on how to interpret and implement fundamental rights law in the European Union, which started as, and still is primarily, an economic community. The Council of Europe’s recommendations and guidelines have an important complementary role to play in clarifying what respect for fundamental rights entails in the digital age and suggesting concrete actions to realise this vision.

This article, first published on 14th September 2022 , reflects the views of the authors and not those of the Media@LSE blog nor of the London School of Economics and Political Science.

What’s wrong with the News?

The rise of data analytics has made journalists and their editors confident that they know what the people want. Why, then, did almost one-third of respondents to the Reuters Institute’s latest Digital News Report say that they regularly avoid news altogether?

The British public can’t get enough news about Brexit – at least, that’s what news platforms’ data analytics say. But, according to the Reuters Institute’s latest Digital News Report, 71% of the British public tries to avoid media coverage of the United Kingdom’s impending departure from the European Union. This disparity, which can be seen in a wide range of areas, raises serious questions about news organizations’ increasingly data-driven approach to reporting.

The rise of data analytics has made journalists and their editors confident that they know what people want. And for good reason: with a large share of news consumed on the Internet, media platforms know exactly which stories readers open, how much they read before getting bored, what they share with their friends, and the type of content that entices them to sign up for a subscription.

Such data indicate, for example, that audiences are interested in extraordinary investigative journalism, diet and personal-finance advice, and essays about relationships and family. They prefer stories with a personal angle – say, detailing an affected individual’s fate – rather than reports on ongoing conflicts in the Middle East or city hall coverage. And they are drawn to sensational stories – such as about US President Donald Trump’s scandals and antics – under “clickbait” headlines.

But if newsrooms were really giving audiences what they wanted, it seems unlikely that almost one-third (32%) of respondents in the Digital News Report, the world’s largest ongoing survey of online news consumption, would report that they regularly avoid news altogether. Yet they did, and that figure is up three percentage points from two years ago.

The most common explanation for avoiding the news media, given by 58% of those who do, is that following it has a negative effect on their mood. Many respondents also cited a sense of powerlessness.

Moreover, only 16% of participants approve of the tone used in news coverage, while 39% disapprove. Young people, in particular, seem fed up with the negativity bias that has long been regarded as a sure-fire way to attract audiences. For many, that bias feels disempowering. Conversations indicate that the problem is compounded for young parents, who want to believe that the world will be good to their children. Younger generations also feel consuming news should be more entertaining and less of a chore.

One reason for the disconnect between the data and people’s self-reported relationship with the news media may be the “guilty pleasure” effect: people have an appetite for voyeurism, but would prefer not to admit it, sometimes even to themselves. So, even as they click on articles about grisly crimes or celebrity divorces, they may say that they want more “quality news.”


When newsrooms indulge readers’ worst impulses, the consequences are far-reaching. Media are integral to holding anyone wielding power or influence accountable, and to mobilizing civic engagement. Democracies, in particular, depend on voters being well informed about pressing issues. News organizations thus have a responsibility to report on serious topics, from political corruption to climate change, even if they are unpleasant.

That does not mean that readers’ complaints about media’s negativity bias should be disregarded. On the contrary, if people are to be motivated to confront challenges that are shaping their lives, they should not be made to feel powerless.

This is where so-called solutions journalism comes in. By balancing information about what needs changing with true stories about positive change, news organizations can fulfill their responsibility both to inform and to spur progress. This means occasionally recognizing that over the long term, living standards have improved globally.

Reconnecting with audiences will also require media organizations to broaden their perspectives. In much of the West, it is largely white, male, middle-class journalists who decide what to cover and how. This limits news media’s ability to represent diverse societies fairly and accurately.

In fact, only 29% of Digital News Report respondents agreed that the topics the news media choose “feel relevant” to them. A joint study by the Reuters Institute and the Johannes Gutenberg University in Mainz, Germany, indicates that the key to increasing this share is to increase diversity in newsrooms.

At the same time, news media need to do a better job of contextualizing and otherwise explaining the news. While 62% of Digital News Report respondents feel that media keep them apprised of events, only half believe news outlets are doing enough to help them understand what is happening. At a time when nearly one-third of people think that there is simply too much news being reported, the solution seems clear: do less, better.

This means listening to readers, not just studying the data analytics. It means balancing good news with bad news, and offering clarifying information when needed. It also means representing diverse perspectives. Media organizations that do not make these changes will continue to lose trust and relevance. That is hardly a sound strategy for convincing consumers that their work is worth paying for.

This commentary was published by Project Syndicate on September 11, 2019