For decades, as the glossy standard-bearer of sports journalism enjoyed by millions of readers, Sports Illustrated served up high-quality writing, stunning photographs and precisely one regular controversy every year with the publication of its wintertime Swimsuit Edition.
But this week S.I. was at the centre of another storm, when a news report accused the iconic outlet of publishing stories that were generated by artificial intelligence, under bylines and headshots of writers who themselves seemed to be fabricated by machines.
On Monday, the online science and tech outlet Futurism reported it had found a series of product reviews on the S.I. website that had been published last summer, initially under the byline of a writer who appeared to have no other online presence – with an author headshot found to be for sale on a site offering AI-generated stock photos – and then, sometime later, replaced with a different byline and author photo that were similarly fake.
When confronted with the allegations, S.I. deleted the content in question. Later, a spokesperson for The Arena Group, which operates S.I., told Futurism the material had been produced by a third-party marketing agency as part of a licence agreement that was being reviewed. According to Arena, that agency, AdVon Commerce, denied the content had been generated by AI and explained that fake bylines had been used in some cases “to protect author privacy.”
While Sports Illustrated was already a shadow of its former self – long past its glory days, when it published legends such as Gary Smith, Frank Deford and George Plimpton – the report stunned the sports journalism community. The publication’s unionized staff issued a statement declaring they were “horrified” by the allegations, and “deplore being associated with something so disrespectful to our readers.” (The statement was signed, “The Humans of the SI Union.”)
The New York Times later reported that Arena had terminated its partnership with AdVon.
On Tuesday, the share price of The Arena Group fell almost 27 per cent on the New York Stock Exchange, closing at US$2.01.
But Arena is not alone. The report comes as sports journalism itself is in the midst of an expanding fake news crisis, as the industry grapples with the thorny temptations AI poses for a business model under extreme pressure.
Last month, ESPN aired an interview with NBA star Damian Lillard after his first game with the Milwaukee Bucks, which some viewers thought looked weird: He wasn’t wearing the same uniform he’d had on during the game, and there were no fans in the stands. Sure enough, the video had been doctored and repurposed from a 2020 interview Mr. Lillard had given in the NBA Bubble, the Portland Trail Blazers jersey he’d worn at the time clumsily removed and replaced.
That came in the wake of the U.S. newspaper chain Gannett pausing its use of AI-generated stories that covered high school games, after readers mocked them for ham-fisted writing that sounded as if the author didn’t have any knowledge of sports. (Insofar as AI doesn’t actually “know” anything, this was true.)
And last April, the German magazine Die Aktuelle published a piece it promoted on the cover as an interview with Michael Schumacher, the former F1 champion who hasn’t been seen in public since suffering a head injury in 2013 while skiing. Though readers were informed inside the magazine that Mr. Schumacher’s quotations had been generated by AI, threats of a lawsuit by Mr. Schumacher’s family and public blowback prompted the magazine to fire the editor.
The S.I. stories were similarly cringe-worthy, featuring the sort of awkwardly formulated sentences for which AI is increasingly known. One piece, a listing of seven volleyballs the reader could click on and buy through Amazon, contained clunky language; the curious declaration that “Volleyballs aren’t as complicated as many people think”; and the suggestion that the Wilson volleyball brand – one of which famously played a supporting role in the Tom Hanks movie Cast Away – is “not exactly the most famous in the sport” but “well-known enough to be recognized by professional players.”
“We’re thinking about AI as omniscient, and in reality it’s like a dumb intern: Everything it does has to be checked, it’s often wrong, it’s a bad writer, it’s cliché-ridden,” said John Affleck, the Knight Chair in Sports Journalism and Society at Penn State’s Donald P. Bellisario College of Communications. “But here’s the thing about dumb interns: They get smarter as time goes on. So, watch how they grow.”
While the news of S.I.’s misstep was greeted with shock, its parent company had already announced that AI would be an important part of its future, telling The Wall Street Journal last February that a number of fitness-related articles published under the Men’s Journal banner, which it also owns, had been produced by mining the magazine’s archives.
Prof. Affleck believes the technology could be used responsibly to benefit both the industry and the audience it serves. In the U.S., for instance, the Associated Press began using AI in 2019 to produce previews for NCAA basketball games.
Similarly, the core of postgame wrap-ups, which are often rote affairs that simply summarize the scoring, could be produced by AI, freeing up a reporter to add their own unique observations from the match. That could lead to stories that emphasize “the value of being in a place as a human, and what we can provide a reader that they’re not going to get from a robot, or artificial intelligence,” he said.
On Tuesday, Prof. Affleck noted that he teaches a course in sports journalism in which he devotes a single class each year to a discussion of ethics. Most years, he said, the discussion covers issues such as “swag and the negotiations that go on about things that are on the record or off the record, and whether reporters should take part in Hall of Fame votes – that sort of thing.”
This year’s ethics class is set for Thursday. “We’re obviously going to have a different discussion,” he said.