Sports Illustrated caught passing off AI content as human



The rapid integration of artificial intelligence into every corridor of society has created a surreal state of journalism in 2023. In these early days, everyone is feeling around in the dark, stumbling over AI-generated content. A bevy of outlets, this site included, have dabbled in it. Conversely, major sites have injected code that blocks GPTBot, OpenAI's web crawler, from scanning their pages for content. Simply put, the debate over AI-generated content has only just begun.

However, Sports Illustrated, which spent 70 years as an industry leader building its reputation on reporting and long-form journalism, took liberties with artificial intelligence that went far afield of current media standards. In the process of trying to sidestep the aforementioned debate, the magazine burned its own reputation.

Nearly four decades ago, venerable reporter George Plimpton penned Sports Illustrated’s infamous April Fools cover story chronicling fictional Mets prospect Sidd Finch’s legendary pitching exploits. Now, imagine if SI’s current management, The Arena Group, had gone to extensive lengths to hide that Plimpton wasn’t an actual living, breathing human being and that the story it published was written by an ever-learning artificial intelligence trained on the intellectual property of organic beings.

Well, that’s an approximation of what The Arena Group did by conjuring fugazi bylines such as “Drew Ortiz” and “Sora Tanaka”. For months, they passed off AI stories as content written by staff writers with made-up bios, then rotated them out for other fictional staffers to avoid detection. Those bios, according to Futurism, read like they belong to the sort of generic, happy-go-lucky dorks AI probably imagines humans to be. Ortiz’s biography described him as the outdoorsy type, “excited to guide you through his never-ending list of the best products to keep you from falling to the perils of nature.”

Meanwhile, Tanaka “has always been a fitness guru, and loves to try different foods and drinks.” The Arena Group did the same with bylines at TheStreet, flaunting expert writers who not only were fictional but also dispensed bad personal finance advice. I’m surprised they didn’t get around to digging up C-SPAN screenshots of Mina Kimes discussing railroad monopolies to gain trust from their readers. This entire operation was the AI content generation analog of the Steve Buscemi undercover-cop-infiltrating-high-school meme. “How do you do, fellow humans?”

Much like Sidd Finch, it turns out Ortiz and Tanaka are fictional identities fabricated by The Arena Group to create the illusion of a meat-and-bones writing staff. As part of that effort, The Arena Group bought pictures for their fictional writers from an AI headshot marketplace, which is concerning in itself. I don’t know what the legal precedent is for AI headshots that closely resemble public figures, but Luka Doncic should definitely be calling his lawyers, because prominent botwriter Drew Ortiz bears a strong resemblance to the Mavs star.

AI-generated content is unpopular enough, but it’s not exactly unethical. However, it definitely shouldn’t be done behind a veil, or a second-rate Luka. If driverless vehicle technology ever advanced to the point that companies began competing with human taxi or Uber drivers, passengers would want to know who they’re riding with and who they’re supporting. AI-generated content is the media’s untested driverless car swerving through these Google-run streets. The Arena Group is akin to a reckless ride-hailing company trying to bamboozle its readers into believing their driver is human. It sounds stranger than fiction, but these are the times we are in.

This went beyond goofy professional execution, though. Once the jig was up and Futurism reached out for comment, The Arena Group launched a cartoonishly duplicitous cover-up, attempting to delete most of the content generated by their fictional writers.

The entire industry is still bumbling its way through this avant-garde terrain, but bylines still denote credibility – or the lack thereof. How are readers supposed to discern what is what and trust the Fourth Estate if media brass condones misleading readers about where content comes from? People want to know if they’re reading Albert Breer or an amalgamation of internet voices designed to sound like him. All The Arena Group did was engender mistrust among their readers by engaging in dishonest practices. Nothing good can come of it, especially at a time when the industry is facing uncertainty and attacks from external influences.

On Monday evening, Variety reported that The Arena Group had ended its partnership with AdVon Commerce, the third-party provider that supplied the branded content. But who knows how far this would have gone if not for human reporting? AI-generated SI Swimsuit Issue cover models? On second thought, maybe I shouldn’t give them any ideas, considering future AI-generated editors could be scanning this very story for inspiration.

Follow DJ Dunson on X: @cerebralsportex 




