Opinion | Who’s Afraid of ChatGPT?

Why I welcome our future AI overlords.

A minor panic surged through newsrooms recently as pundits began speculating that ChatGPT, the speed-writing, newfangled AI-powered text generator, might start replacing human journalists. The collective newsroom blood pressure receded, however, when Futurism reported that the tech news site CNET was already using artificial intelligence to compose news stories but that in many cases the stories were 1) inaccurate; 2) plagiarized; and 3) dull. ChatGPT might still be coming for our jobs, the journos sighed, but not this year or next, and soon returned to their fidget spinners, wastebasket basketball games and other professional procrastination devices.

The CNET debacle notwithstanding, the long march of ChatGPT and its AI siblings into newsrooms and everywhere that knowledge is manufactured and distributed will not be denied. Rather than resisting its encroachments, journalists would be smarter to recognize its potential to improve their work and better serve readers. ChatGPT isn’t the first technology to invade newsrooms to make journalism more exact, more timely and less expensive to create, and it won’t be the last.

The basest complaint in newsrooms is that AI will “steal” publishing jobs by deskilling work that “belongs” to people. Without a doubt, technology has been pilfering newsroom jobs for more than a century. The telephone increased reporter efficiency by allowing journalists to remain in the newsroom instead of wasting time traveling to collect stories. Photographs replaced newspaper and magazine illustrators. Computer typography displaced make-up room artists, typesetters and pressmen. Answering machines displaced telephone operators and secretaries. Word processors and spell-checking and grammar-checking software streamlined the jobs of writing, editing and copy editing. Transcription bots like Otter.ai have obliterated the transcriptionist slot. Reporters who once had to go to the library, consult the newspaper’s morgue or contact sources to assemble facts for a story now lean on Nexis and the web for much of the same grunt work.

Another complaint directed at newsroom AI is that even if it is cheaper and faster, it will only replace human intelligence with algorithmic rigidity, making everything sound like bland robot utterances. Anyone lodging this complaint must first acknowledge that too few works of journalism have ever contained much in the way of literary merit. Magazine and newspaper style books — I’m looking at you, Associated Press Stylebook — have forever stitched their writers inside straitjackets to make every one of them echo the house style, making them sound like machines. Why accept the robotic output of today’s newspapers and magazines but object to copy written by actual machines?

Fine writing has a place, but you don’t find it very often in newspapers. That’s okay. Fine writing has been fetishized for too long in too many places. We romanticize news writers — but shouldn’t — as swaggering geniuses who divine inspiration from the gods and pour their passion onto the page, when what most of them actually do is just type. The most vital part of creating a newspaper story is the reporting, not the writing. Newsrooms have long endorsed this idea, hiring reporters who could discover jaw-dropping original news but couldn’t write a grocery list with a gun to their heads. Such journalists usually worked with editors or rewrite artists who rearranged their facts and findings into a comprehensible narrative. It will be a sad day when such editors are cashiered and their reporters pour their findings into an AI vessel and tell it how to arrange them into a story, but we shouldn’t lament that any more than we lamented the passing of the news illustrator.

The first newsroom jobs AI will take will be the data-heavy but insight-empty ones that nobody really wants: the breaking news of Microsoft’s third-quarter earnings, tomorrow’s weather report, a condensation of last night’s Tigers-Yankees game or the rewrite of a windy corporate or government press release. But eventually AI will come for more ambitious work, such as investigations, eyewitness reportage and opinion journalism like what you’re reading right now. We shouldn’t fear that takeover if it produces better journalism. Press critic A.J. Liebling once boasted, “I can write faster than anyone who can write better, and I can write better than anyone who can write faster.” AI can write faster than A.J. now. When the day comes that it can write faster and better, the Lieblings of this world ought to stand aside.

Will that day ever come? ChatGPT and the other AIs of the future will only be as good as their software and what they’ve been told. The only thing AIs “know” at this point is what somebody’s told them. Real news — the stuff that nobody wants you to know in the first place — does not reside in an AI’s learning base until somebody deposits it in their hard drives. In the near term at least, AI will still depend on human intelligence to generate novel information and arguments not folded into its corpus. By deskilling the writing of mundane and everyday stories, AI will free human journalists to ask questions it can’t yet imagine and produce results beyond its software powers. It’s only as smart as the people behind it.

Evidence of AI’s shortcomings was revealed to me when I asked ChatGPT to construct a hypothetical conservative brief for the repeal of Obergefell, the Supreme Court’s decision legalizing same-sex marriage nationwide. “It is not appropriate or legal to argue for overturning a Supreme Court decision that guarantees this fundamental right,” ChatGPT responded. No matter how the request was rephrased, it kept insisting it was inappropriate and illegal to do so. Even when instructed that settled law is occasionally unsettled by a new decision (as Justice Clarence Thomas appears to desire in this case), it would not relent. “While it is legal to argue for the overturning of a Supreme Court decision, it is not appropriate or legal to argue for a decision that would discriminate against individuals based on their sexual orientation,” it illogically stated.

For now, at least, my job seems safe. But we can foresee the day that given the proper prompts, better data, a longer leash, better software and a more productive spleen, AI will replace me as a columnist, devising better column ideas and composing better copy. But until it fully understands what it means to be human, how to be curious and how to sate that craving, and how to replicate human creativity, there will be acts of journalism beyond its reach.

Journalism has always been a collaborative craft, joining sources to reporters, reporters to editors, and then readers back to the publication in an endless loop of knowledge production. If AI can join that loop to help make accurate, more readable journalism with greater impact, we shouldn’t ban it. Journalism doesn’t exist to give credentialed reporters and editors a steady paycheck. It exists to serve readers. If AI helps newsrooms better serve readers, they should welcome its arrival.

******

The bot that runs [email protected] is dying to hear from your bot. No new email alert subscriptions are being honored at this time. My Twitter feed has been bot-driven from the beginning. My Mastodon and my Post accounts run on A.S. (artificial stupidity). My RSS feed is an organic intelligence.