Let me be clear from the outset: a human wrote this article. The same cannot be said for a growing share of what you read. News Corp, among other media corporations, is reportedly using generative AI to produce around 3,000 news stories per week in Australia, and media companies worldwide are increasingly turning to AI to generate content.
Large language models like GPT-4 do not generate factual information; they predict language. ChatGPT has been likened to an automated mansplaining machine: frequently wrong, yet always confident. Even with human oversight, presenting this output as journalism is troubling. Beyond the problems of inaccuracy and misinformation, it simply makes for poor reading.
Content farms are nothing new, and media outlets were publishing low-quality content long before AI arrived. What has changed is the speed, scale, and reach. Given News Corp's extensive reach in Australia, its use of AI deserves scrutiny. For now, the AI-generated material appears limited to "service information," such as articles on finding cheap fuel or traffic updates. But we should not take too much comfort from that; it signals the direction in which things are heading.
In January, tech news outlet CNET was caught publishing AI-generated articles riddled with errors. Since then, many have braced for a surge in AI-generated reporting. CNET employees and Hollywood writers have responded by unionizing and striking, protesting AI-generated writing and demanding better safeguards and accountability around its use. Should Australian journalists join the call?
The use of generative AI is part of a broader shift in which mainstream media organizations increasingly resemble the data-hungry, algorithmically optimized digital platforms desperate to monetize our attention. Media corporations' opposition to crucial Privacy Act reforms, which would curb exactly this behavior and protect us online, makes the strategy plain. As traditional media profits dwindle in the digital economy, some outlets have adopted the surveillance capitalism business model of the platforms: if you can't beat them, join them. Adding AI-generated content to the mix will make things worse, not better.
What happens when the web is so saturated with AI-generated content that new models are trained not on human-made material but on AI outputs? Will we be left with a cursed digital ouroboros, an AI feeding on its own creations?
Jathan Sadowski has coined the term "Habsburg AI," after the famously inbred European royal dynasty: a system so heavily trained on the outputs of other generative AIs that it becomes a mutated entity with exaggerated, grotesque features.
Research suggests that large language models like ChatGPT degrade rapidly when trained on data created by other AIs rather than original human material. Another study found that without fresh data, an "autophagous loop" sets in and content quality declines. As one researcher put it, "we're about to fill the internet with blah." Media organizations leaning heavily on AI-generated content are accelerating the problem. Perhaps there is a grim silver lining: rampant AI-generated content may ultimately sow the seeds of its own demise.
AI in the media need not be all bad. Some applications genuinely serve the public, such as improving accessibility: transcribing audio content, generating image descriptions, or enabling text-to-speech delivery. These are worth getting excited about.
Hitching a struggling media industry to the bandwagon of generative AI and surveillance capitalism is not in Australia's long-term interest. People in regional areas deserve genuine local reporting, and Australian journalists deserve protection from AI encroaching on their jobs. Australia needs a robust, sustainable, and diverse media ecosystem that holds the powerful to account and keeps people informed, rather than one that replicates the problems exported from Silicon Valley.

Samantha Floreani is a digital rights activist and writer based in Naarm.