The Topline from TVND.com


The Fight Over Adopting AI In The Newsroom


Long before his career in politics, which would ultimately lead him to the White House, Ronald Reagan found fame and fortune as an actor, a fact that sets up one of the biggest laughs in 1985’s “Back To The Future” for Christopher Lloyd as “Doc Brown."

One of the roles he would be best known for was not on the big screen, where he had played George Gipp in the 1940 film “Knute Rockne, All American,” but instead on the emerging smaller screen of television. Reagan was the longtime face of the General Electric Company, both as host of the series General Electric Theater and as the commercial spokesperson for the company, delivering GE’s post-World War II advertising messages, which typically ended with him saying: “At GE, Progress is our most important product."

The messaging was very intentional. General Electric, like many large industrial companies after World War II, had to convince the buying public to embrace, and spend its money on, the booming market for products and services meant to make life easier: everything from the electric refrigerators replacing the iceboxes found in most kitchens to the electric-powered locomotives replacing the steam engines that pulled trains across the nation.

GE made both of those innovations, along with many more of that era. But as always, human beings are often reluctant to embrace change, even when it is in their best interest to do so.

Throughout our time in television news, we have witnessed many changes in how the news is gathered, produced, and delivered.

In our earliest years, during the mid to late 1970s, we were part of the first generation of television news journalists to move from shooting 16mm film for news footage to using small-format cameras and videotape, which didn’t require the time-consuming process of developing film before it could be edited and aired. This change was dubbed “Electronic News Gathering” and given the acronym “ENG." In the 1980s, we saw typewriters and carbon-paper script sets replaced by computers on every desktop, and eventually by computers in the hands of reporters in the field. The 1990s? Graphics systems that brought high-resolution typography and graphics to every market. By the 2000s, early control room automation systems reduced staffing from at least a half-dozen people down to just a couple.

And today, we all carry devices in our pockets that can nearly match the production power of an entire television station.

Those were just a few examples of the significant changes that come to mind. Like any industry, progress comes with the promise of making things better, faster, and perhaps most importantly to those who own television stations, more economical. Note that we consciously avoided using the word cheaper, because the impetus is not universally to do things “on the cheap” — even if some owners might be more willing to do just that.

Which brings us to the current debate we have been hearing more about lately—the growing adoption of artificial intelligence tools in the newsroom.

Along with that is the ancillary argument over whether or not the use of such tools should be disclosed to the audience and, if needed, just how that should be done. (Yes, we can definitely imagine some station in the future promoting the fact that their news is “100% human-powered.”)

Putting it another way, “progress never happens without some pain in the process.” (Overwrought alliteration notwithstanding.)

Let’s be clear: fear is typically the biggest hurdle to progress. The arrival of AI as a newsroom tool would certainly trigger some uncertainty about the status quo. After all, the changes we detailed a few paragraphs ago created a significant learning curve for doing things differently. And each led to the loss of jobs made superfluous by technological advancements.

The fear of advancing AI in nearly every walk of life is understandable. The rise of artificial intelligence has almost invariably been depicted in science fiction as the root of humanity's decline, if not its outright destruction. So it is not too surprising to hear that there are those in the newsroom who see the arrival of AI tools as the equivalent of the moment when Skynet becomes sentient in the plot of “The Terminator."

Unless someone has a super-secret AI implementation in their newsroom that we are totally unaware of, we have to say that, while we completely understand the uncertainty, we do find these fears a bit overblown.

We’re not yet fans of any "virtual anchor" examples we’ve seen, so let’s put that nascent technology aside (at least for the moment) and focus on the editorial tools that are now in place or being planned. Whether arriving in TV newsroom computer systems like ENPS and iNEWS, along with newer challengers like Ross Inception, Octopus Newsroom, and Cuez, or in “outboard” tools, such as Magid’s Collaborator and even Grammarly as a grammar and spell checker, these new tools are at the heart of the battles we are hearing about.

The one thing all of these tools have in common is that, like all current artificial intelligence, they are not automatic. They start with human input. While you can ask AI to do everything from generating story ideas to writing online versions of broadcast scripts (and vice versa), it does not do any of these things without being asked, or, in the current parlance, prompted.

We’re writing this very article with the help of Grammarly, the AI-powered grammar and spell checker. We will accept some of the program's suggested changes—others we will dismiss “with prejudice,” as our lawyer likes to say. Ultimately, we control what the tool does in our editorial process. Working as a minimal staff, we are reminded of our days as a newscast producer on a weekend shift, when we mostly worked alone for much of the shift, until the anchor, who also did some reporting each day, got back into the newsroom and could read over the scripts we had written.

We haven’t felt the need to attach a disclosure about the use of an AI tool in our writing, no matter how minimal it might be. We are aware that this is a point of contention in some newsrooms, where there has been debate over whether to include language acknowledging that artificial intelligence has been used in the editorial process. Some stations have chosen to attach such language to their online stories when this has been the case.

Obviously, this practice is more difficult to follow during a live broadcast. We have heard the argument that, because AI seems to be in everything these days, the audience either assumes it is used in preparing the news or doesn’t care if it is. Others would argue that the more transparency, the better, especially in an age when many believe in “fake news.”

As of this writing, there are no specific FCC rules regarding the use of AI in broadcasting. The closest analogue might be the rules that have been in place since the early days of radio regarding the use of “mechanical reproduction.” Live music was a staple of early radio broadcasting, be it singers, orchestras, or everything musical in between. Recording technology was very primitive at the time. But as that technology improved, there were concerns that radio stations would try to mislead audiences into believing that all musical performances were live, when they were not. By the end of radio’s first decade in the early 1930s, federal regulators (then the Federal Radio Commission, forerunner of the FCC) required the use of the “mechanical reproduction announcement” on stations to prevent any confusion. These announcements would be along the lines of “portions of today’s programming were reproduced by means of electrical transcription or magnetic tape.”

When first adopted, the FCC required such an announcement to be aired every half-hour during any recorded programming. By 1956, the FCC amended its rules so that such announcements were only needed on programming "in which the element of time is of special significance and presentation of which would create, either intentionally or unintentionally, the impression or belief on the part of the listening audience that the event or program being broadcast is in fact occurring simultaneously with the broadcast.” 

That said, we recall having to air such an announcement at the start of each broadcast day during our early career stint as a master control operator. 

We would offer that having an “AI Creation Announcement” at the end of each newscast could serve much the same purpose. An on-screen graphic stating “Portions of this newscast created with the assistance of artificial intelligence writing tools” (or something along those lines) would seem a solid step towards complete transparency with viewers.

(At least until the adoption of "virtual anchors” requires that disclaimer be a bit more extensive.)

Newsroom purists, we’re sorry, but we don’t see the use of AI-powered tools going away in the future. We also don’t see a time when human-powered journalism will be replaced entirely in the editorial process. Sure, AI is still quite capable of making mistakes, known by the curious term of art “hallucinations.” Humans, particularly in the still-vital review phase of the editorial process, remain the last line of defense.

Just as they have been with every form of automation that has come to the business.

A reminder that, before his years in acting and politics, Ronald Reagan was a broadcaster, first with WHO Radio in Des Moines and later as the play-by-play voice of the Chicago Cubs. (A screen test while in Southern California to cover the Cubs’ Spring Training led to his career as an actor.) While at WHO, “Dutch” Reagan, as he was known, gained notice for his “live play-by-play baseball broadcasts.” He created these broadcasts by reading the details off the wire services and describing the action as if he were watching the games.

There is no record we can find as to whether he told listeners that he really wasn’t.

-30-

(Errata: The original version of this article referred to “the war to end all wars” in reference to World War II, when in fact that quote was actually about World War I. We regret the error and thank those readers who alerted us to it.)