Crikey
Comment
Cam Wilson

Nine’s Georgie Purcell image shows the risks of AI in newsrooms

A major Australian media company depicting a female politician in skimpy clothing seems like a regressive act reflecting the sexist attitudes of the past. Yet this happened in 2024 — and it is a taste of the future to come.

On Monday morning, Animal Justice Party MP Georgie Purcell posted to X, formerly Twitter, an edited image of herself shared by 9News Melbourne.

“Having my body and outfit photoshopped by a media outlet was not on my bingo card,” she posted. “Note the enlarged boobs and outfit to be made more revealing.”

Purcell, who has spoken out about gendered abuse she regularly receives as an MP, said she couldn’t imagine it happening to one of her male counterparts.

After Purcell’s post quickly gained a lot of attention online — a mix of shock and condemnation — Nine responded. In a statement provided to Crikey, 9News Melbourne director Hugh Nailon apologised and chalked the “graphic error” up to an automation error: “During that process, the automation by Photoshop created an image that was not consistent with the original,” he said.

While not naming the tool, Nailon appears to be saying the edited image was the result of Adobe Photoshop’s new generative AI features, which allow users to fill or expand existing images using AI. (An example Adobe uses is inserting a tiger into a picture of a pond.) Reading between the lines, it appears someone used this feature to “expand” an existing photograph of Purcell, and the tool generated her with an exposed midriff rather than the full dress she was actually wearing.

Stupidity, not malice, could explain how such an egregious edit originated. Someone who works in the graphics department of a major Australian news network told me that his colleagues are already using Photoshop’s AI features many times a day. He said he thought something like what happened to Purcell was bound to occur eventually, given the heavy use of AI, limited oversight and tight timeframes for work.

“I see a lot of people surprised that the AI image made it all the way to air but honestly there is not a lot of people checking our work,” he said.

As someone who’s worked at multiple large media companies, I can attest to how often decisions about content seen by hundreds of thousands or even millions of people are made with little oversight, often by overworked and junior employees.

But even if you buy Nine’s explanation — and I’ve seen people doubt that AI could have made these edits without being specifically prompted to show more midriff — it doesn’t excuse the image or negate its impact. Ultimately, one of the biggest media companies in Australia published an image of a public figure that had been manipulated to make it more revealing. Purcell’s post made it clear that she considers this harmful. Regardless of the intent behind it, depicting a female politician with more exposed skin and other changes to her body has the same kind of effect, if less severe, as the deepfaked explicit images of Taylor Swift circulated last week.

The Purcell image is also telling of another trend in Australian media: newsrooms are already using generative AI tools even if their bosses don’t think they are. We tend to think about how the technology will change the industry from the top down, such as News Corp producing weekly AI-generated articles or the ABC building its own AI model. According to the UTS Centre for Media Transition’s “Gen AI and Journalism” report, leaders of major Australian newsrooms say they’re still considering how to use generative AI and don’t profess to be using it meaningfully in production yet.

But, as in other industries, we know Australian journalists and media workers are using it. We might not have full-blown AI reporters yet, but generative AI is already shaping our news through image edits and the million other ways it could be — and probably already is — being used to help workers, such as summarising research or rephrasing copy.

This matters because generative AI makes decisions for us. By now, everyone knows products like OpenAI’s ChatGPT sometimes just “hallucinate” facts. But what of the other ways it shapes our reality? We know that AI reflects our own biases and repeats them back to us. Consider the researcher who found that when asked to generate “Black African doctors providing care for white suffering children”, MidJourney would always depict the children as Black, and would occasionally even show the doctors as white. Or the group of scientists who found that ChatGPT was more likely to call men an “expert” and women “a beauty” when asked to generate a recommendation letter.

Plugging generative AI into the news process puts us in danger of repeating and reinforcing our lies and biases. While it’s impossible to know for sure (as AI products are generally black boxes that don’t explain their decisions), the edits made to Purcell’s picture were based on assumptions about who Purcell was and what she was wearing — assumptions that were wrong. 

And while AI may make things easier, it also makes the humans responsible for it more error-prone. In 1983, researcher Lisanne Bainbridge wrote about how automating most of a task creates more problems rather than fewer. The less you have to pay attention — say, by generating part of an image rather than having to find another — the greater the chance something goes wrong precisely because you weren’t paying attention.

There’s been a lot of ink spilled about how generative AI threatens to challenge reality by creating entirely new fictions. This story, if we are to believe Nine, shows that it also threatens to eat away at the corners of our shared reality. But no matter how powerful it gets, AI can’t yet use itself. Ultimately the responsibility falls at the feet of the humans responsible for publishing.
