Nieman Lab Prediction 2026: Editors will start tackling the 5% challenge – and it won’t be fun (at first)

The advances of generative AI have put those in charge of newsrooms on an emotional rollercoaster. While 2023 and 2024 were the years of reckless experimentation (“Hey, look what these models can do!”), in 2025, AI realism took over. Great ideas turned out to be hard to implement, costly, or solutions looking for problems (“Nice, but it’s not serving anyone!”). Putting strategy back into AI development became key.

This is why 2026 is likely to become the dip of the ride. Because now, the strategy needs to be filled with life. And while editors at media conferences widely agree that AI will force newsrooms to focus on unique, original journalism and experiences that create value for their audiences and deepen customer connections, some detailed data analysis will make many of them feel queasy. Because the result will often be not that different from what an editor recently revealed at an industry gathering: Only 5% of a subset of his brand’s content was original journalism. The subtext was clear, of course: The rest could have been done by an AI. Welcome to the 5% challenge.

Expect many newsroom leaders to become busy next year figuring out what exactly makes their brand stand out in the emerging sea of content. And even harder: finding a way to scale the 5% (or maybe 20%) to proportions that guarantee their journalism’s survival. Because let’s face it, the era of the web has been the age of copy-and-paste journalism. And this is exactly what (once) younger journalists have been raised to do in the past 20 years or so. Sitting behind the screen all day and competing for reach was the job. The word “reporting” — picking up stories from the streets by looking at things and talking to people, face-to-face or on the phone — was converted into the phrase “reporting on the ground,” which sounded as if leaving the comfort of the office was an award-worthy niche discipline.

For leaders, doing all of this will involve conveying some hard truths to many newsroom inhabitants: telling them that their daily work has to change — and fast. Converting agency copy into a snappy story — the AI has already done it. Doing some service journalism because customers safely clicked on it — the chatbot will have been there already. Upselling subscriptions with branded recipes — maybe, as long as ChatGPT still spoils the dish with hallucinations. Unfortunately, “stop doing” is among the hardest disciplines for any kind of enterprise. Unlike running exciting experiments and excelling in the innovation department, stopping routines and common practices is neither sexy nor does it bring career advantages. On the contrary, it means robbing people of things they love to do, or are at least proficient in. And it takes away the status and power that was attached to practicing them. Speaking of rollercoasters, there will be some uncomfortable loops at the bottom of this ride.

There are four areas where media brands can scale the human-made part of their journalism

But here comes the uplifting part: Focusing one’s journalism on “the real thing” (again) will be fun — for seasoned hacks and creator-type newcomers alike. And it can also help bridge the newsroom generation gap. While younger colleagues can learn research and source-building skills for access and investigations (including persistence and picking up a phone) from their more experienced peers, older ones will profit from everything that the Insta-and-Spotify generation can bring to the desk, like video, podcasting, data research, and brand-building competencies.

There are four areas in particular where media brands can scale the human-made part of their journalism: First, with strong personal brands who will play out their authenticity and humanness to connect with audiences (plenty has been published about news creators in 2025). Second, with deep expertise in niche areas that AI-generated content cannot provide because it is prone to converge around the average. Third, with investigations that make news consumers proud of “their” news brand. And fourth, with strong local journalism that is deeply rooted in its communities — in most cases, AI won’t go there. Creators who understand their formats and their stuff can figure in all of these areas, of course.

The sizable rest can safely be left to the workings of AI, where agents will do a much faster, more targeted, and personalized job than humans could have done, provided humans do the necessary prep work for accuracy. Markus Franz, chief technology officer of Munich-based Ippen Group, predicts that with agentic AI, the current “human in the loop” principle will be replaced with a “human on the loop” approach in the future that helps with scalability.

In all of these scenarios, journalism jobs will move in two quite different directions. One set of roles will lean toward the more techie side. They will need to shape the new AI-mediated world of journalism, ensure scalability that adheres to the quality standards of journalism, and build compelling products for customers that make them connect directly with the brand. On the other side, we will see the new “old-style” journalists who do everything to solicit exclusive information and/or establish themselves as personal brands. Talent will most likely have to pick sides early on, and it is essential that journalism education reflects and fosters this. As soon as everyone has settled into their new seats, the rollercoaster can go on its next climb.

This prediction was published with Harvard University’s Nieman Lab on December 16, 2025.

 

Anne Lagercrantz, SVT: “Journalism has to move up the value chain”

Anne Lagercrantz is the Director General of SVT Swedish Television. Alexandra talked to her about how generative AI has created more value for audiences, SVT’s network of super users, and what will make journalism unique as opposed to automated content generation.

Anne, many in the industry have high hopes that AI can do a lot to improve journalism, for example by making it more inclusive and appealing to broader audiences. Looking at SVT, do you see evidence for this?  

I can see some evidence in the creative workflows. We just won an award for our Verify Desk, which uses face recognition and geo-positioning for verification.

Then, of course, we provide automated subtitles and AI-driven content recommendations. In investigative journalism, we use synthetic voices to ensure anonymity.  

I don’t think we reach a broader audience. But it’s really about being inclusive and engaging.

In our interview for the 2024 report, you said AI hadn’t been transformative yet for SVT. What about one year later? 

We’re one step further towards the transformative. For example, when I look at kids’ content, we now use text-to-video tools that are good enough for real productions. We used AI tools to develop games, and then we built a whole show around them.

So, we have transformative use cases, but it hasn’t transformed our company yet.

What would your vision be? 

Our vision is to use AI tools to create more value for the audience and to be more effective.  

However – and I hear this a lot from the industry – we’re increasing individual efficiency and creativity, but we’re not saving any money. Right now, everything is more expensive.  

Opinions are split on AI and creativity. Some say that the tools help people to be more creative, others say they are making users lazy. What are your observations?  

I think people are truly more creative. Take the Antiques Roadshow as an example, an international format that originated at the BBC.  

We’ve run it for 36 years. People present their antiques and have experts estimate their value. The producers used to work with still pictures but with AI support they can animate them.  

But again, it’s not the machine, it’s the human and the machine together.  

You were a newsroom leader for many, many years. What has helped to bring colleagues along and have them work with AI?  

I think we cracked the code. What we’ve done is, we created four small hubs: one for news, one for programmes, one for the back office and one for product. And the head of AI is holding it all together.  

The hubs consist of devoted experts who have designated time for coaching and experimenting with new tools. And then there’s a network of super users; we have 200 in the news department alone.

It has been such a great experience to have colleagues learn from each other.  

It’s a top-down movement but bottom-up as well. We combine that with training and AI learning days with open demos. Everyone has access and the opportunity to take part.

We’ve tried to democratize learning. What has really helped to change attitudes and culture was when we created our own SVTGPT, a safe environment for people to play around in. 

What are the biggest conflicts about the usage of AI in the newsroom? 

The greatest friction comes from having enthusiastic teams and co-workers who want to explore AI tools when there are no legal or financial frameworks in place.

It’s like curiosity and enthusiasm meeting GDPR or privacy. And that’s difficult because we want people to explore, but we also want to do it in a safe manner. 

Would you say there’s too much regulation?  

No, I just think AI is developing at a speed we’re not used to. And we need to find the time to get our legal and security departments on board.

Also, the market is flooded with new tools. And of course, some people want to try them all. But it’s not possible to quickly assess whether they’re safe enough. That’s when people feel limited.

No one seems to be eager to talk about ethics any longer because everyone is so busy keeping up and afraid of missing the boat. 

Maybe we are in a good spot because we can experiment with animated kids’ content first. That’s different from experimenting with news where we are a lot more careful.  

Do you get audience reaction when using AI?  

There are some reactions, more curious than sceptical.  

What also helps is that the Swedish media industry has agreed upon AI transparency recommendations, saying that we will tell the audience it is AI when it has a substantial influence on the content. It could be confusing to label every tiny thing.

Where do you see the future of journalism in the AI age, now that reasoning models are coming up and everyone is thinking AI can do much of the news work previously done by humans?

I’m certain that journalism has to move up in the value chain to investigation, verification and premium content.  

And we need to be better in providing context and accountability.  

Accountability is so valuable because it will become a rare commodity. If I want to contact Facebook or Instagram, it’s almost impossible. And how do you hold an algorithm accountable?  

But it is quite easy to reach an editor or reporter. We are close to home and accountable. Journalists will need to shift from being content creators and curators to meaning makers.  

We need to become more constructive and foster trust and optimism.  

Being an optimist is not always easy these days. Do you have fears in the face of the new AI world? 

Of course. One is that an overreliance on AI will lead to a decline in critical thinking and originality.  

We’re also super aware that there are a lot of hallucinations, that misinformation could undermine public trust, and that it is difficult to balance innovation with ethical AI governance.

Another fear is that we are blinded by all the shiny new things and that we’re not looking at the big picture.  

What do you think is not talked about enough in the context of journalism and AI? 

We need to talk more about soft values: How are we as human beings affected by new technology?  

If we all stare at our own devices instead of looking at things together, we will see loneliness and isolation rise further.  

Someone recently said we used to talk about physical health then about mental health, and now we need to talk about social health, because you don’t ever need to meet anyone, you can just interact with your device. I think that’s super scary.  

And public service has such a meaningful role in sparking conversations, getting people together across generations.  

Another issue we need to talk more about is: if there is so much personalization and everyone has their own version of reality, what will we put in the archives? We need a shared record.

This interview was published by the EBU on 16th April as an appetizer for the EBU News Report “Leading Newsrooms in the Age of Generative AI”.