AI: It Comes in Peace... to Help, Not Replace
- Kelly Hendrick
- Jun 3, 2024
- 2 min read
Updated: Jul 6, 2024

There’s an understandable the-sky-is-falling mentality around AI, one that’s touched many fields: healthcare fears security breaches of sensitive patient information, education is coping with students’ use of generative text, and computer programmers worry about becoming obsolete. This year (and possibly next), we’ll see the aftermath of last year’s writer and actor strikes, which were in part a response to feeling replaced by AI onscreen and behind the camera. The field of Communications is no exception; journalists have been forced to balance the positive effects AI has had on their workload against the extra work it may demand. Because of this, the key is to think of AI not as something working against content creators but as a tool they can work with.
Technology's Effects on How We Work
AI has become an important tool for investigative journalists, whose work has evolved as technology itself evolves. The 2010 WikiLeaks release totaled 1.7 GB, while the Pandora Papers released in 2021 reached a massive 2.94 TB—more than a 1,700-fold increase, a scale at which AI lets journalists extract data that might otherwise be inaccessible (Fridman et al., 2023). Journalists can also use AI to reduce repetitive processes, leading to increased efficiency and improved accuracy. A 2021 AP survey of almost 200 news organizations found that journalists were looking forward to at least some aspects of AI, including transcription and social media content.

Technology's Influence on the Training We Need
For now, though, AI remains just a tool for communicators to work with, which means it’s another piece of technology they have to learn. They must not only evaluate what AI has already done but also be prepared for what it will do next. This means they now have to learn how to:
Interpret What's Been Done
A computer cannot navigate bias the same way a human can, especially if the dataset is already discriminatory. A computer’s visualizations and interpretations will be skewed if a dataset is incomplete—producing unreliable, inaccurate, and potentially harmful news. Without human oversight, AI content could fall short of editorial standards (Shaw, 2023). This means content creators will have to learn how to apply their editorial skills to a machine’s output rather than their own.
Self-Educate on What Comes Next
Not every problem the media faces can be solved with technology. Journalists need to stay aware of developments and trends so that they can keep learning as AI keeps growing. Open-source tools are consistently being developed, and data journalism has begun to make a place for itself. That field incorporates data analysis, visualizations, and multiple databases to tell stories (Fridman et al., 2023), which means communicators may need a base knowledge of these tools.
(This article was written for you by a person, not a computer.)
References
Fridman, M., Krøvel, R., & Palumbo, F. (2023). How (not to) run an AI project in investigative journalism. Journalism Practice. https://doi-org.ezproxy.snhu.edu/10.1080/17512786.2023.2253797
Shaw, J. J. (2023, March 2). ChatGPT, AI, and journalism: Legal and ethical pitfalls explained. Press Gazette. https://pressgazette.co.uk/comment-analysis/ai-journalism-legal-ethical-considerations/