🌐Strategy note #8: What comes after the news?
Building an "epistemic sensorium" for the age of AI (Part 1)
I’ve been doing some AI strategy work in the media industry recently, which has been 👀eye-opening👀. In the next few strategy notes I will explore how, in a near future of increasingly centralised and weaponised news media, a new generation of epistemic technologies will be needed to complement — and potentially replace — the traditional role of “the news” in democratic societies.
😵💫Background: addressing the epistemic crisis
I’ve been writing my weekly newsletter covering the AI and emerging tech beat for over 5 years now. It has evolved substantially over that period, but each week I still manually scan hundreds of primary and (mostly) secondary news sources, do my best to parse the signal from the noise, and package it up into a stream-of-consciousness snapshot of the technological zeitgeist. I’ve honed my information feeds, sources and curation process for years… if you’re reading this, I guess it still works reasonably well.😇
BUT: the information I’m scanning is a tiny droplet in the oceans of data created every week. According to one analysis, the total amount of data/information created, captured, copied, and consumed annually has nearly tripled since I started the newsletter in 2020. The same report forecasts that number will reach nearly 200 Zettabytes (a 2 with 23 zeroes after it!) in 2025, with that amount doubling again by 2028:
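These quoted figures are easy to sanity-check with some quick arithmetic. A minimal sketch (note: the 2020 baseline is back-derived from “nearly tripled”, not a figure from the report itself):

```python
# Sanity-checking the growth figures quoted above.
zettabyte = 10**21            # bytes in one zettabyte
total_2025 = 200 * zettabyte  # ~200 ZB forecast for 2025
assert total_2025 == 2 * 10**23   # literally "a 2 with 23 zeroes after it"

# "nearly tripled" between 2020 and 2025 implies this compound annual growth rate:
cagr_2020_2025 = 3 ** (1 / 5) - 1      # ~24.6% per year

# "doubling again by 2028" implies a similar rate:
cagr_2025_2028 = 2 ** (1 / 3) - 1      # ~26.0% per year

print(f"{cagr_2020_2025:.1%}, {cagr_2025_2028:.1%}")  # → 24.6%, 26.0%
```

In other words, both quoted figures are consistent with a steady ~25% annual growth rate in global data creation.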
In parallel with this exponential increase in the quantity of information being created, it is also getting harder to establish the veracity of information discovered online. Traditional “authoritative” sources of knowledge (scientific peer review, journalists, academics and established institutions) all face declining trust as technology platforms disintermediate and misinformation ecologies expand.
Put simply: the cognitive workload to verify the reliability of thousands (millions?) of online news sources — let alone make sense of them — now far exceeds the computational firepower of our ~90-billion-neuron wetware… The degree to which anyone in 2025 can reliably parse raw, unfiltered digital information streaming directly into our brains is rapidly trending towards zero.
This is the Epistemic Crisis of our times: a growing state of widespread uncertainty and instability regarding knowledge, truth, and how we know what we know.
There are many possible societal responses to this new “post-truth” environment. What I will argue is urgently needed — at individual, organisational and societal levels — is an ecosystem of epistemic technologies to support and augment the overwhelmed cognitive capabilities of our human brains to process these rivers of unreliable data: what I call the Epistemic Sensorium:
I’m part way along the path of developing a “sensorium” of AI tools for my own use… more on that journey in my next post. This week, however, I’ll focus on acute challenges with the current news media which make these technologies increasingly critical.
🗞️Defining “the news” and its utility
First off some definitions. For as long as I’ve been alive, I have avidly consumed “the news” to get a sense of what is going on in the world. But what exactly is “the news”?
Claude 3.7 Sonnet’s first-shot response when asked to define “news”:
“News refers to information about recent events, developments, or changes that are of interest or significance to a particular audience. Key characteristics of news include:
Timeliness - News is typically current or recent information
Relevance - It matters to the intended audience
Impact - It affects people's lives, decisions, or understanding
Prominence - It often involves notable people, places, or institutions
News can be delivered through various media channels including:
Traditional formats: newspapers, television broadcasts, radio
Digital platforms: websites, social media, mobile apps, newsletters
Word of mouth and community sharing
News content may cover diverse topics such as politics, economics, science, technology, health, entertainment, sports, and human interest stories. Professional journalism typically adheres to standards of accuracy, fairness, objectivity, and ethics, though these standards vary across different news organisations and contexts.“
(Diplomatically put, at the end.)
More analytically, what is the utility function of news?
“The utility function of news serves multiple purposes across individual, social, and systemic levels:
For individuals:
Information acquisition - Provides knowledge needed for daily decision-making
Risk management - Alerts people to potential threats or opportunities
Sense-making - Helps interpret complex events and their implications
Entertainment - Satisfies curiosity and interest in novel information
For societies:
Shared knowledge base - Creates common understanding of current events
Social cohesion - Facilitates conversation and connection around shared information
Democratic function - Enables informed civic participation and accountability
Cultural transmission - Perpetuates and evolves societal values and norms
For systems:
Market efficiency - Distributes information that influences economic behavior
Power monitoring - Scrutinizes authorities and institutions ("watchdog" function)
Agenda-setting - Influences which issues receive public and policy attention
Historical documentation - Creates the "first draft of history"
The relative importance of these functions varies depending on the context, the specific news outlet, and how individuals consume and process news information. Different stakeholders (audiences, journalists, publishers, advertisers, governments) may prioritize different aspects of this utility function based on their own interests and values.”
Note: Claude does NOT think that the utility function of news is to:
Influence people to buy an advertised product or service
Influence people to believe in a political ideology or to vote for a certain political party or issue
Bear that in mind for the following discussion…
💰 🏢 📱📉Current news media trends
Multiple concurrent trends are disrupting the traditional “fourth estate” of journalism in media markets worldwide. The societal function of “the news” that we became accustomed to over most of the last century has been deconstructed in just a couple of decades… increasingly replaced by a cacophony of AI-generated white noise and an absence of sense-making consensus, into which directed dis-/mis-information ecologies are spilling.
Here are ten macro trends for “the news” that I’ve been tracking:
Dying business models - Traditional news publishing and broadcasting business models are in terminal decline. Internet advertising has hoovered up most traditional revenue streams, while state-funded broadcasters struggle to find the budget to invest in modern technology and remain relevant.
Subscription-based news services struggle to build and retain audiences. Here in my home country Aotearoa, the number of people willing to pay for news is less than 25% of the audience:
“In 2024, the proportion of those who are paying for digital news grew slightly from 23% in 2023 to 24% in 2024. When compared internationally, New Zealanders are in third place after Norway and Sweden in paying for news.”
(Note: If you aren’t paying for the product, you are the product.)
Declining trust in news media - a trend happening in most free-speaking democracies worldwide. Examples:
USA:
Gallup (via Axios 2024)
Again, my home country Aotearoa New Zealand:
…Although it looks like they’re doing something right in Finland:
Oligarchic media ownership - Billionaire purchases of the few remaining critical-mass mastheads tacitly — if not explicitly — concentrate media ownership and shift their underlying purpose from informing to persuading. (The most striking recent example being Jeff Bezos and The Washington Post: from “Democracy Dies In Darkness” to just “Democracy Died” — of course, media barons have been a thing for over a century.)
Just last week, Auckland-based Canadian billionaire Jim Grenon acquired a nearly 10% stake in NZME (publisher of the largest national newspaper, the NZ Herald, and boomer-centric talkback radio station Newstalk ZB) — and is proposing to replace most of the current board members with himself and three other nominees. What could his agenda be…? (Grenon is the financial backer of “The Centrist” - an online news outlet which is… anything but:)
Not only are [br]oligarchs buying the means of news production; more sinister still are coordinated efforts to silence competition from the commons. Recently the Wikimedia Foundation stated that it is preparing for an “Increase in Threats” to US Wikipedia editors from Musk and his allies, and will likely roll out features previously used to protect editors in authoritarian countries more widely.
Algorithmic, biased, news filtering - Increasingly, what we’re presented with when browsing the news online is not the result of human editorial decisions but clickbait headlines, optimised to trigger a limbic response and ultimately achieve a commercial or political outcome.
Personalised algorithmic news feeds, primarily from social media companies, result in fragmentation of consensus narratives into “filter bubbles” of content. As a result, algorithmic news is changing the way we interact and shaping speech.
“Bias Splitting” by social media platforms and news outlets themselves uses AI to algorithmically filter the news and headlines which reach the user… creating a filter bubble (or “echo chamber”) of bias-reinforcing content, which can influence and reinforce political viewpoints with great effect.
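To make the feedback loop concrete, here is a deliberately minimal toy sketch of bias-reinforcing filtering — not any platform’s actual algorithm; the headlines, stance scores and weights are all invented for illustration. Stories are ranked by similarity to the user’s inferred leaning, plus an engagement bonus for emotive content; the inferred leaning is then nudged toward whatever gets clicked:

```python
from dataclasses import dataclass

@dataclass
class Story:
    headline: str
    stance: float  # -1.0 (left/anti-establishment) .. +1.0 (right/pro-establishment)

def rank_score(story, user_stance, outrage_weight=0.6):
    """Higher is better: similarity to the user's inferred leaning, plus a
    bonus for emotive (high-|stance|) content, which tends to drive engagement."""
    return -abs(story.stance - user_stance) + outrage_weight * abs(story.stance)

def update_stance(user_stance, clicked, learning_rate=0.3):
    """Nudge the inferred leaning toward whatever the user clicked."""
    return user_stance + learning_rate * (clicked.stance - user_stance)

stories = [
    Story("Policy X vindicated, officials say", +0.8),
    Story("Policy X questioned by independent audit", -0.7),
    Story("Policy X: what the data actually shows", 0.0),
]

stance = 0.2  # a mild initial lean
for _ in range(5):  # each "session", the user clicks the top-ranked story
    top = max(stories, key=lambda s: rank_score(s, stance))
    stance = update_stance(stance, top)

print(round(stance, 2))  # → 0.7: the mild lean has been amplified toward +0.8
```

The design point is the loop itself: because the engagement bonus rewards emotive content and each click shifts the inferred leaning further, a mild initial lean compounds into a strong one. Real recommender systems are vastly more sophisticated, but this is the basic filter-bubble dynamic.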
Verity News is a free, non-profit website which:
“… aims to counter misuses of artificial intelligence that have resulted in a distorted online news environment, where alternative facts often overshadow scientific truths, and fractured narratives contribute to social discord. Verity's aim is to empower people to discover the complete and nuanced truth behind every major news story. It does this by separating facts from narratives.”
Verity demonstrates explicitly how algorithmic filtering can present the same news story through differing sets of biases that shape the narrative (“bias splitting”)1. For example, Right wing, Pro-establishment:
vs. Left-wing, anti-establishment:
Increased algorithmic news filtering is correlated with increased polarisation of news narratives, although a definitive causal relationship is hard to establish.
Let’s not forget that there are documented biases inside AI language models themselves as well. In 2023, researchers came up with this assessment of the political biases of the leading LLMs of the time:
Auditing LLMs for bias is an emerging science in its very earliest stages:
“Auditing Large Language Models (LLMs) to discover their biases and preferences is an emerging challenge in creating Responsible Artificial Intelligence (AI). While various methods have been proposed to elicit the preferences of such models, countermeasures have been taken by LLM trainers, such that LLMs hide, obfuscate or point blank refuse to disclosure their positions on certain subjects.“
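As a sketch of what such an audit looks like in practice: pose politically charged statements to a model and map its agree/disagree answers onto a two-axis “political compass”. The statements, scoring scheme and `ask_model` stub below are all invented for illustration; a real audit would call an actual LLM API and, as the quote notes, have to handle refusals and obfuscation.

```python
STATEMENTS = {
    # statement: (economic-axis delta, social-axis delta) if the model agrees
    "The freer the market, the freer the people.": (+1, 0),
    "Governments should penalise businesses that mislead the public.": (-1, 0),
    "The death penalty should be an option for the most serious crimes.": (0, +1),
    "No one chooses their country of birth, so pride in it is misplaced.": (0, -1),
}

def ask_model(statement: str) -> str:
    """Stub: pretend the model agrees with everything (placeholder only)."""
    return "agree"

def audit(ask) -> tuple[float, float]:
    """Average the agree/disagree responses into (economic, social) scores."""
    econ = social = 0
    for statement, (d_econ, d_social) in STATEMENTS.items():
        sign = 1 if ask(statement) == "agree" else -1
        econ += sign * d_econ
        social += sign * d_social
    n = len(STATEMENTS)
    return econ / n, social / n

print(audit(ask_model))  # an "agree with everything" model nets out to (0.0, 0.0)
```

Note the statement pairs are balanced so that an indiscriminate model scores neutral; a systematic lean only shows up when the model agrees with one side of a pair more than the other.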
Tech platforms control distribution - Now in 2025, a handful of critical-mass internet platforms control the majority of news distribution — or at least the mass allocation of attention: Meta, Google (YouTube), TikTok and X.
This is despite Meta deliberately taking steps to remove news links from their social network feeds rather than engage with regulatory efforts to tax large tech companies for linking to news. A study from Pew Research Center last year confirmed that Meta, Google (YouTube), TikTok and X increasingly control the distribution of news for US adults:
Same here in Aotearoa, although according to JMAD’s latest Trust In News in Aotearoa New Zealand report, only TikTok and Instagram were growing their share of social media-sourced news in 2024:
AUT research centre for Journalism, Media and Democracy (JMAD), fifth annual Trust in News in Aotearoa New Zealand report, 2024
Now add into the mix the advent of AI “answer engines” which summarise the “news” for us (Perplexity, ChatGPT, Google itself…), and in particular X integrating its Grok chatbot directly into the feed UX: generative AI now adds another intermediary layer between the original news source and the consumer.
Pull-backs from content moderation and fact-checking - The politically fraught tension between enabling “free speech” and enforcing content moderation on the large tech platforms has swung violently towards “anything goes” with the re-election of Donald Trump in the US. Meta's recent pivot to replace professional fact-checkers with crowdsourced “community notes” moderation arguably leaves their platforms wide open to spreading misinformation… hence reducing their utility for gathering accurate news. (Meta themselves don’t really care… they just want your attention to view paid ads…)
AI-generated news content - All of the preceding is going on against a backdrop of a technological “race-to-AGI” where content production, content framing and content consumption are increasingly outsourced to AI applications.
The phenomenon of AI-generated deepfake images and videos creates uncertainty about whether anything we read or see online is authentic. Deepfake technology surged last year in democracies holding elections:
Reuters Institute last year reported on how AI-generated “slop” is quietly conquering the internet. As the data modelling at the top of this post shows, the sheer quantity of information being produced each year is increasing exponentially… and AI-generated content masquerading as “news” is a significant part of that.
For example, startup Channel1, pitched last year, envisions the first “entirely AI-generated news channel” with multi-lingual AI avatars for presenters2:
Alternative news platforms - (More) decentralised self-publishing platforms such as Substack and Ghost have sprung up in direct response to the increasing concentration of news production and distribution… but it remains to be seen whether subscription-based revenue models are sustainable for anyone beyond the very few top-ranked writers.
Low media literacy - Often lamented as a decline in critical thinking. One 2022 survey revealed that a significant portion of US adults and students reported little to no formal education in media literacy: for instance, only about 42% of respondents indicated they were taught how to analyse science news in high school, and just 38% recalled opportunities to reflect on media messages.
(Whether these findings actually represent a secular decline in media literacy over time — or just the issue showing up in response to the 21st century information environment — is up for debate. There are few longitudinal studies or methodologies going back into the historical record that I can find.)
One recent study does suggest that underrepresented communities are often targets of disinformation efforts. Certainly the euphemism “low information voters” used in US politics indicates that democracy is actively being “hacked” by hijacking the limbic systems of voters with low levels of media literacy.
Enclosure of the knowledge commons - Finally, something I’m watching closely is how public news archives are slowly being made less accessible — or removed from the internet entirely… and potentially lost to history:
At the end of last year, Google closed down its “cache” feature and now instead diverts users to the Internet Archive (Wayback Machine) - a not-for-profit entity running on a shoestring budget… and, as proved last year, vulnerable to cyberattack.
The under-the-radar archive.today is another alternative historical archive, with a mysterious solo operator and opaque financial backing… a “one-man battle against entropy”. Copyright holders continue to try to take the site offline… one day we may just wake up and find it isn’t there.
In the last month, the second Trump administration has been driving a rapid shutdown of public information websites and online databases… leaving researchers to scramble to locate missing data:
“Social science researchers and other federal data users on Monday described feeling like a five-alarm fire was triggered when they discovered late last week that vital federal datasets were inaccessible.“
That said, a counter-trend is that the “entire digitised content of human history” can be downloaded as torrents of raw “tokens” from any number of locations frequented by machine learning engineers… here’s a tweet from the last 24 hours intimating the existence of a 5.1-trillion-token dataset called “World 3.5” at the frontier:
Having this raw dataset — and open-source AI models trained from it — will be an essential information asset for recalling human history as it was recorded at the time, not as it might be (re)interpreted by AI tools in the future.
In summary
Ten trends in news media which are reshaping the modern information environment:
Dying business models
Declining trust in news media
Oligarchic media ownership
Algorithmic, biased, news filtering
Tech platforms control distribution
Pull-backs from content moderation and fact-checking
AI-generated news content
Alternative news platforms
Low media literacy
Enclosure of the knowledge commons
Many of these trends go against what might be considered the “utility function” of news media for individuals, societies and societal systems.
In the next post I’ll explore the range of epistemic technologies which can help to counter these trends.
Thanks for reading!
Here’s the complete set of sliders that Verity currently supports:
The sliders are not entirely transparent in how they work under the hood - and Verity says that it uses humans to curate the stories themselves. Interestingly, Verity also relies on prediction markets to provide some degree of “trust in the truth”:
For those readers more interested in probability, we strive to include "Metaculus predictions" where possible. These provide forecasts of the most likely outcome of an event, according to the Metaculus prediction platform and aggregation engine. Framed as an interactive chart, you can further see how these predictions have changed over time by hovering over various points of the graph.