News Corp and other media companies have recently announced major partnerships with OpenAI, granting the American artificial intelligence research company access to the content of their global mastheads to “share journalistic expertise”. Google has a similar arrangement, and Meta is reportedly looking to do the same with other publishers.
First, let me be clear. I celebrate AI companies paying original content holders for training data and outputs. It’s what should have been done in the first place; instead, we had AI products like ChatGPT hoovering up vast amounts of content from news websites, forums, videos and book manuscripts without permission. Without original and accurate content, AI model outputs are prone to errors and fabrications.
But the question must be asked of these deals: to what end and for how long?
Expiry on the horizon
AI deals are different from distribution or syndication deals, which are about finding mutually beneficial ways of distributing content and opening new channels to reach new audiences.
AI companies need original content makers only until their own AI products become good or accurate enough content makers themselves, particularly if part of these deals involves training them on “journalistic principles”. Once that line has been crossed, tech companies may feel they no longer need those original creators.
AI outputs don’t have to be great; they just have to be “good enough” for people to accept them as passably credible or accurate. Google’s recent trials of integrating AI into its search engine give a preview of this broken social contract between tech companies and media companies.
When you use AI-powered search on Google, the first page is essentially filled with the company’s own AI-curated summaries, with linkbacks to publishers and other website owners deprioritised underneath, significantly reducing attribution and referral traffic to those original sites.
Google search results and SEO are predicated on the idea that Google takes those valuable snippets of information from external parties for free because it links back to their respective sites. On this basis, Google has regularly argued that it is not a publisher, and therefore not subject to publisher rules and regulations. AI summaries seem to be on track to do away with this premise, or to restrict the valuable “front page” to Google’s own AI results, or perhaps to preferred partners only.
Approaching ‘good enough’
It’s still early days for this setup, and the results are clearly bad. But how long will it take for these AI results to be “good enough” for most people?
Individual commercial deals between tech companies and publishers have their flaws, and to date are on shaky ground, such as those facilitated through the Australian news media bargaining code. But tech companies require credible, professional news and public interest journalism to legitimise their platforms as having accurate, informative content. Without it, they become cesspools of conspiracy theories, hearsay and hate speech (as X, the platform formerly known as Twitter, has), or forcibly saccharine and superficial, like Meta’s Threads, which tries to avoid news and “political content” altogether.
But these arrangements remain significant as long as audiences use those platforms as sources of news and information, if only so that we don’t devolve into a society paralysed by misinformation and unable to distinguish fact from fiction.
We therefore need a coordinated, national effort that forces tech companies to come to the table over news content and AI (ironically, this was the original design of the news media bargaining code), one that facilitates industry-wide negotiations and collective bargaining with most if not all media companies, and that allows for agreements in perpetuity or over very long terms, not unlike copyright law.
We could even argue that national-level data agreements are warranted, since AI products benefit from publicly available data and everyone’s data footprint. This can only occur at a governmental, regulatory, whole-of-industry level, not through pick-and-choose partnerships.
As we’ve seen with Meta and its reluctant commitments to news content, it’s all too easy for tech companies to love us one minute, and then leave us the next.