Ethical Challenges of Artificial Intelligence in the Newsroom

Artificial intelligence (AI) is rapidly being integrated into nearly every corner of society, including journalism and public media newsrooms. Journalists now have a powerful new set of tools at their disposal, but those tools raise important questions about ethical use, accuracy, and the potential impact on journalistic practices.

Nick Diakopoulos, an associate professor at Northwestern University, is at the forefront of addressing these concerns through his “Generative AI in the Newsroom Project.” He emphasizes the need for careful consideration of how AI technologies are implemented in news production, highlighting ethical implications and the limitations of AI models. Diakopoulos advocates for a thorough evaluation of AI tools’ utility in journalism and stresses the importance of maintaining high standards of accuracy and integrity.

Generative AI, particularly large language models like ChatGPT, presents both opportunities and challenges for news outlets. While some organizations have embraced these technologies for tasks such as automated article writing, others have implemented restrictions to ensure human oversight. Diakopoulos predicts that most news operations will adopt an approach requiring human involvement before publishing AI-generated content.

One of the significant limitations of generative AI is its inherent inaccuracy, as evidenced by Diakopoulos’s audit of Microsoft’s Bing chatbot integrated with ChatGPT. Nearly half of the chatbot’s responses were inaccurate, highlighting the need for ongoing evaluation and refinement of AI systems.

Despite concerns about AI replacing journalists, Diakopoulos believes that AI will instead complement human journalists, creating new roles and opportunities in editing, story gathering, and fact-checking. He envisions a future where AI-generated transcripts can free up journalists’ time for more meaningful tasks.

AI offers a wide range of applications to assist journalists across various stages of news production:

  • Content Discovery: AI tools like ChatGPT sift through vast amounts of data from diverse sources to identify relevant information for news stories.
  • Document Analysis: AI assists in processing and summarizing extensive public records, categorizing documents, and providing quick insights for reporters.
  • Translation: AI aids in translating published stories into multiple languages or processing raw data in different languages, though human translation remains essential for accuracy.
  • Tips Processing: AI can sort through tips submitted via email or other channels, identifying potentially newsworthy content for further investigation.
  • Social Media Content Creation: AI assists in summarizing textual content for social media posts, facilitating engagement with the audience.
  • Automated Writing (Structured Data): AI generates text based on structured data inputs, such as sports scores or weather forecasts, though accuracy and relevance must be ensured (a brief sketch follows this list).
  • Automated Writing (Unstructured Data): AI extracts structured data from unstructured sources like press releases and generates text, requiring human oversight for accuracy verification.
  • Newsletters: AI personalizes newsletter content to cater to individual interests, enhancing reader engagement.
  • Text Summarization: AI helps summarize large amounts of information, enabling journalists to identify key points quickly.
  • Comment Moderation: AI classifies comments for moderation purposes, though human oversight is necessary to minimize errors (also sketched after this list).
  • Content Transformation and Reuse: AI aids in formatting articles for reuse on different platforms, expanding the reach of journalistic content.
  • Search Engine Optimization: AI suggests variations of headlines for A/B testing and generates metadata for SEO purposes, improving the discoverability of news content.
  • Push-Alert Personalization: AI helps personalize push alerts for different audience segments, increasing engagement with news content, though human oversight is necessary to ensure relevance and accuracy.
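To make the "human in the loop" point concrete, here is a minimal sketch of automated writing from structured data. It assumes the OpenAI Python SDK with an API key in the environment; the model name, teams, and scores are placeholders, and the draft it returns would still go to an editor before publication.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Structured data a newsroom might already receive from a scores feed
game = {
    "home": "Brewers", "away": "Cubs",
    "home_score": 5, "away_score": 3,
    "venue": "American Family Field",
}

prompt = (
    "Write a two-sentence game recap for a local radio newsletter. "
    f"Use only these facts and do not add any others: {game}"
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": prompt}],
)

draft = response.choices[0].message.content
print(draft)  # a human editor reviews the draft before anything is published
```

The key design choice is that the model only restates facts the newsroom already holds; it never reports anything on its own.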
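Comment moderation works the same way: the model is a first-pass triage tool, not the final gatekeeper. This sketch assumes the same SDK; the label set, prompt wording, and model name are illustrative, and anything the model is unsure about is routed to a human moderator.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def triage_comment(text: str) -> str:
    """Ask the model to label a reader comment; uncertain cases go to a human."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{
            "role": "user",
            "content": (
                "Label this reader comment as APPROVE, REJECT, or HUMAN_REVIEW. "
                "If you are at all unsure, answer HUMAN_REVIEW. "
                "Reply with the label only.\n\n" + text
            ),
        }],
    )
    label = (response.choices[0].message.content or "").strip().upper()
    # Anything outside the two confident labels defaults to human review
    return label if label in {"APPROVE", "REJECT"} else "HUMAN_REVIEW"

print(triage_comment("Great reporting, thanks for covering the school board."))
```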

For all that AI can do, newsrooms need to consider where its use could lead to ethical conflicts. “I think it will take some time for news organizations to develop best practices,” said Jared Schroeder, an associate professor specializing in media law and technology at the University of Missouri School of Journalism.

“There is no set best practices yet and we have two problems: It’s new and it’s changing. We are not done. The AI of today will be different next year and in five years,” he added.

Are your journalists using AI? What guardrails are in place to ensure it doesn’t cross the line of appropriate use?

Post your thoughts below.



Dave Edwards helps public media professionals become more effective leaders through executive coaching and consulting services. He previously transformed WUWM Milwaukee Public Radio into one of the country’s most successful public radio stations and served as chair of the NPR Board of Directors. He also teaches classes at Marquette University and online. He blogs on productivity and management-related issues at www.DaveEdwardsMedia.com.

Find out more about the services Dave provides to public media HERE.
