Channel 4 CEO Alex Mahon has called for joint industry action and new regulation to ensure young people can find verified, independent news easily on social media. Her call follows Channel 4 research revealing the challenges facing Gen Z in an era where platforms have announced “a wanton abandonment of the pursuit of truth”.
Speaking at a joint Channel 4 and Royal Television Society event in London, Mahon revealed the findings of Channel 4’s study of 13-27-year-olds, which show a generation grappling with the idea of truth, and argued that the way Gen Z learn to judge fact, fiction and fairness as they grow older may be the defining issue of our age.
Mahon called for action because “global platforms are dominant” and have no legal requirement to take responsibility for what they publish, and because “defenders of truth are always on the back foot”, since “lying is more exciting and fiction travels faster than fact”. She argued: “We must think about objective truth and validated news as a public good. We need to ensure they are present on new platforms, rather than see them as compensating for a market failure that we regulate for on the old platforms.”
Mahon suggested three solutions:
Trustmark: Introduce a trustmark as an indicator of factual accuracy and trustworthiness for content produced by professional, regulated media. This could allow tech companies, their algorithms, advertisers and consumers to distinguish instantly between what is checked and true and what is not.
Algorithmic prominence on social media: Regulate for public service media (PSM) content to be prominent on social media platforms. This is already being implemented for PSMs on TV platforms, and the same principle could be applied to ensure high-quality, trusted content is boosted – not throttled – on social platforms. Regulators should also explore mechanisms for a fair revenue share, ensuring PSMs are compensated for the value and engagement their content generates.
Regulation that supports PSM to shape AI: Train large language models (LLMs) using validated PSM content. Existing LLMs have been trained on the vast and variable global internet, but outputs could be of higher quality if the input included PSM content. Robust regulation should ensure transparency about what AI models are trained on, as well as secure fair value and compensation for data owners.
The research that informed Mahon’s speech was carried out by the insight agency Craft, based on a nationally representative sample of 3,000 people aged 13-65. It revealed a number of developing issues, including:
Growing gender divergence: Nearly half of Gen Z men (45%) believe that “we have gone so far in promoting women’s equality that we are discriminating against men” and 44% think women’s equal rights have gone far enough.
Democratic disengagement: More than half (52%) think “the UK would be a better place if a strong leader was in charge who does not have to bother with parliament and elections” and one-third (33%) believe “the UK would be a better place if the army was in charge”.
Uncertainty about who and what to trust: Young people have flatter hierarchies of trust across media, placing as much confidence in posts from friends (58%) and influencers (42%) as – and sometimes more than – in established journalism. One-third (33%) trust alternative internet-based media personalities, compared with 12% of 28-65-year-olds.
Mahon highlighted four ways that the shift in consumption across platforms affects what is consumed: short-form content means less detail; speed means less context; algorithms push the salacious to the top of feeds faster; and solo viewing reduces the socialisation of points of view, making it less likely that radical or socially destructive perspectives will be questioned. “Algorithms designed to elicit anger, surprise or outrage have a devaluing effect on the currency of reliable information,” she said. “The business model of the technology giants is at odds with the safety of our societies.”