The future of artificial intelligence (AI) in journalism hinges on a delicate balance between its benefits and risks, as media outlets strive to enhance audience trust and news literacy. As AI technologies reshape how information is disseminated and consumed, the implications for news organizations are profound.
AI summaries, which companies like Google call "AI overviews," are increasingly prevalent, offering users a convenient way to access information. However, these AI-generated summaries are often criticized for misrepresenting facts. They can also siphon traffic from news websites, posing a direct threat to the financial viability of journalism. With AI capable of repackaging content, sometimes even from paywalled sources, readers may find little incentive to click through to original articles, rendering individual stories less valuable.
The challenge for news organizations now is navigating this “zero-click” world, where convenience often trumps the necessity of engaging with original content. This shift has sparked a pressing question: how can journalism create value in an era where AI is so readily accessible?
Trust in news has become a significant concern in the AI landscape. The latest Digital News Report: Australia survey found that while nearly half of respondents (47 percent) believe AI-generated news is cheaper to produce, many remain skeptical. About 43 percent perceive AI news as less trustworthy, and 38 percent consider it less accurate than news reported by human journalists. Participants also raised concerns about algorithmic bias, with one individual noting that AI could exacerbate existing biases because of how it is programmed.
As AI takes more prominent roles in content creation, audiences struggle to discern the origins of news articles. One participant articulated a common sentiment: “I wouldn’t be able to say if I’ve seen it used to generate an article because I wouldn’t be able to pick what’s being written by a person, what’s being written by AI.” This uncertainty threatens the perceived value of journalism, especially if audiences are unable to distinguish between AI-generated and human-produced content.
The role of human journalists remains critical in this evolving landscape. Despite advancements in AI, research indicates that trust in journalism hinges on the human element. According to the same report, 43 percent of respondents are comfortable with news produced by journalists who utilize AI assistance, a figure that drops to 21 percent when AI plays a more dominant role. Concerns about AI’s effectiveness in handling sensitive topics and cultural contexts further complicate audience acceptance.
Interestingly, those who have undergone news literacy education demonstrate greater acceptance of AI-generated content. Approximately 40 percent of individuals educated in news literacy feel comfortable with news produced mainly by AI, compared to just 15 percent of those who have not received such education. This gap underscores the importance of fostering media literacy to instill confidence in navigating complex information landscapes.
Transparency regarding AI’s role in news production is essential for maintaining audience trust. Some organizations, including the ABC and The Guardian, have proactively disclosed their use of generative AI, fostering a sense of accountability. However, many smaller news outlets either lack clear AI policies or do not publicly share their AI utilization practices, creating confusion among audiences who are often unaware of how AI influences journalism.
Audience preferences are shifting as younger generations increasingly gravitate toward AI-assisted news consumption. A third of respondents indicated interest in receiving personalized news summaries, while others expressed enthusiasm for tailored story recommendations. This trend highlights that convenience and relevance are primary drivers for younger audiences, with those under 35 showing a heightened desire for AI to facilitate easier comprehension of news content.
As journalism navigates the complexities introduced by AI, the future will likely depend on the industry’s ability to balance technological advancements with the need for trust and transparency. Raising audience comfort levels and improving news literacy will be crucial for maintaining the integrity of journalism in an AI-driven world.
Professor Sora Park is the Director of the News and Media Research Centre at the University of Canberra, and Dr. TJ Thomson is a senior lecturer and Australian Research Council DECRA Fellow at RMIT University.
This article is part of a series on AI, Journalism and Democracy. For further insights, visit the original reports by 360info.