AI tools present an existential threat to diversity and editorial independence in journalism

University News | Last updated 16 June 2023

The journalism industry must ensure strict guidelines are in place in newsrooms to combat the threat that Generative AI tools such as ChatGPT, Bard, and DALL-E pose to diversity and editorial independence, three media experts from Birmingham City University have cautioned.


Leading academic and industry figures Diane Kemp, Marcus Ryder, and Paul Bradshaw, from the Sir Lenny Henry Centre for Media Diversity, have produced an advisory document to help media professionals avoid propagating in-built bias and amplifying diversity problems already present in journalism when using Artificial Intelligence apps and services.

According to a survey by the World Association of News Publishers, half of all newsrooms currently use Generative AI tools, yet only a fifth have guidelines in place, and it is unclear whether any of these guidelines explicitly address diversity and inclusion.

Paul Bradshaw, Course Leader for the MA in Data Journalism at Birmingham City University, said: “As journalists start to experiment with different ways to incorporate generative AI tools such as ChatGPT into their workflow, it’s vital that we think about how editorial independence is maintained.

“A central part of our role as journalists is giving a voice to the voiceless and shining a spotlight on important issues.

“These principles are a first step towards establishing those best practices.” 

Under the six guiding principles, which were peer-reviewed by colleagues in journalism and academia, journalists using Generative AI are urged to report mistakes and biases, build diversity and transparency into prompts, and treat generated text or copy with healthy scepticism.

The guidelines suggest that a lack of plurality in the ownership of media outlets, and a lack of diversity and representation in original source material, will continue to cause inherent imbalances and inaccuracies across the generative AI industry unless rebalanced.

Marcus Ryder, Head of External Consultancies at the Sir Lenny Henry Centre for Media Diversity, said: “Journalists urgently need a set of guidelines on how to work with ChatGPT and generative AI programmes responsibly, in a way that does not exacerbate existing diversity issues.

“The Sir Lenny Henry Centre for Media Diversity has been disappointed that in all the recent debate around ChatGPT and generative AI there has been little, if any, acknowledgement of how it could affect diversity in society in general, and in journalism in particular.”
