
Journalism, AI, and ethical challenges


A look at the intersection of journalism and artificial intelligence (AI) highlights the need for new journalistic standards. Ethical implications, intellectual property, content verification, and moderation are just a few of the issues that lie ahead.


Currently, AI lacks awareness. It cannot explain why it wrote a certain article, it is unaware when it produces misleading content, and it cannot be held accountable, even though it can shape public opinion.


Understanding what constitutes responsible use of AI goes beyond the application of large language models (LLMs) in a journalistic context. Each person learns differently; the same goes for AI, which relies heavily on the data and information available to it.


While AI is not new, it is now undergoing constant development and change. For decades, we have used chatbots for interactive journalism and automated news updates. We have LLMs that help with fact-checking by cross-referencing information, and we are familiar with using LLMs to create personalised content based on users' preferences and reading histories.
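

As a rough illustration of the personalisation described above, the sketch below ranks candidate headlines by how much they overlap with a reader's recent reading history. The headlines, function names, and simple word-overlap scoring are assumptions made for clarity, not any newsroom's actual system, which would use far richer signals.

# A minimal sketch of reading-history personalisation (illustrative only).
# Articles are represented by their headlines; scoring is plain word overlap.

def tokens(text: str) -> set[str]:
    # Split a headline into lowercase words, stripping basic punctuation.
    return {w.lower().strip(".,!?") for w in text.split()}

def recommend(history: list[str], candidates: list[str], top_n: int = 2) -> list[str]:
    # Build a word "profile" from everything the reader has already read,
    # then rank candidate headlines by how many profile words they share.
    profile = set().union(*(tokens(h) for h in history))
    ranked = sorted(candidates,
                    key=lambda c: len(tokens(c) & profile),
                    reverse=True)
    return ranked[:top_n]

if __name__ == "__main__":
    history = ["Oman budget boosts spending on health",
               "New health centres planned across the Sultanate"]
    candidates = ["Health ministry expands vaccination drive",
                  "Football league resumes next week",
                  "Budget surplus expected as oil prices rise"]
    print(recommend(history, candidates))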


Automation, and later artificial intelligence, in journalism was already being discussed in the 1980s, 1990s, and 2000s. That was also when the news and media industries adopted full computerisation of the newsroom and embraced the civilian internet.


From the mid-2000s to the 2010s, machine learning algorithms became an essential component of journalism. At the time, AI was simply known as automated or automatic processing, used for distribution, content recommendation, and data analysis. Early accounts of the use of AI in journalism take us back to 2014, when the Associated Press began automating stories about corporate earnings.


News organisations have been under pressure to adopt AI as a result of technological advancement, but also due to market pressures, financial challenges, and competitiveness. But there is a catch! News organisations rely on AI tools, infrastructure, and configurations provided by major tech companies such as Google, Amazon, and Microsoft. These monopolies are the backbone of the current media system. This is a key area that needs scrutiny: control of infrastructure confers power!


One of the most common applications of LLMs is automated reporting, which turns structured data such as financial results and sports scores into news articles. Machine learning algorithms are increasingly being used to interpret and generate human-sounding language.
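

To make the idea of automated reporting concrete, here is a minimal sketch in the spirit of the AP earnings stories mentioned earlier: a row of structured financial data is slotted into a templated brief. The company, figures, and field names are hypothetical examples, not any agency's actual pipeline.

# A minimal sketch of template-driven automated reporting (illustrative only).

def earnings_brief(company: str, quarter: str, revenue_m: float,
                   prior_revenue_m: float, eps: float) -> str:
    # Turn one row of structured earnings data into a short news brief.
    change = (revenue_m - prior_revenue_m) / prior_revenue_m * 100
    direction = "rose" if change >= 0 else "fell"
    return (
        f"{company} reported {quarter} revenue of ${revenue_m:.1f} million, "
        f"which {direction} {abs(change):.1f} per cent from a year earlier, "
        f"with earnings of ${eps:.2f} per share."
    )

if __name__ == "__main__":
    print(earnings_brief("Example Corp", "third-quarter", 412.5, 389.0, 1.07))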


We have already leaped into content optimisation through image recognition, detection of newsworthy events, real-time transcription, and automatic translation into multiple languages. What has become clear is the reshaping of the information ecosystem.


While all of this is exciting, there are profound concerns about artificial intelligence and its potential impact on elections, privacy, liability, and news. We are at a stage where, scrolling through the feed, we do not know whether we can trust what we are reading. In times of fake news, wide-ranging uncertainty, and dissatisfaction, the average news consumer cannot truly tell whether a news text was written by a person or created by a machine, or whether it is accurate or misleading.


AI has moved from a novelty to a pivotal component of journalism; it is now less an innovation than a challenge, with ethical and practical implications as well as future prospects. One of the main priorities is a clear and transparent understanding of the legal policies governing artificial intelligence in the news, along with its ethical considerations and responsibilities. It is essential that such a policy establishes clear boundaries for what AI may be used for and how the editorial process should safeguard quality.


People and machines can team up successfully. The aim is to recognise the areas in which people are essential, and when and where technology should be used. Artificial intelligence, hopefully, is not there to replace editors; rather, it is there to optimise processes and adapt content to different formats. Not every problem in the news can be addressed with technological solutions. Emotional intelligence, empathy, and unique human skills are essential for storytelling.

