U.S. Republican presidential nominee Donald Trump’s posting of artificial intelligence-generated images suggesting pop star Taylor Swift had endorsed him earned him the ridicule of Swift’s fans and journalists this week, but at least one pro-democracy watchdog warned that the incident is a reminder that regulators must act with far more urgency to stop such behavior.
On Sunday, Trump used his Truth Social account to share the AI-generated images, known as “deepfakes,” one of which showed Swift dressed up as Uncle Sam with the caption, “Taylor Wants You to Vote for Donald Trump.” Others showed people wearing shirts that read, “Swifties for Trump.”
Trump wrote, “I accept,” in his post sharing the images, suggesting he was accepting a formal endorsement.
“The AI-generated deepfakes of Taylor Swift are yet another example of AI’s power to create misinformation that deceives and defrauds voters,” said Lisa Gilbert, co-president of Public Citizen. “The potential harms to our society that could result from such misinformation, including abuses of our elections, are wide-reaching and immensely damaging.”
The images represent the latest escalation in Trump’s fixation on Swift, who has made headlines over the last year with her record-breaking Eras Tour and who publicly endorsed Joe Biden in the 2020 election.
Far-right commentators melted down earlier this year as misinformation and baseless conspiracy theories spread about Swift and her boyfriend, Kansas City Chiefs tight end Travis Kelce, being part of a “deep state” plan to secure a Democratic victory in November.
Swift has made her liberal political views well-known in recent years but has yet to announce an endorsement in the 2024 presidential race. She condemned Trump’s violent rhetoric during nationwide racial justice protests in 2020 and accused him of “stoking the fires of white supremacy and racism.”
Her criticism appears to have irritated Trump for years, with the former president complaining earlier this year that a failure by Swift to endorse him would be “disloyal” because he signed legislation making it easier for musical artists to collect royalties from streaming platforms.
Political observers have speculated that this year, repeated criticism of so-called “childless cat ladies” by Trump’s running mate, Sen. JD Vance (R-Ohio), as well as the Republicans’ support for forced pregnancy laws, will not help the GOP ticket win over Swift’s large fan base.
Trump’s use of deepfake images comes after Public Citizen spearheaded calls for the Federal Election Commission to regulate AI-generated images; earlier this month, the FEC’s Republican chair, Sean Cooksey, announced the agency would not establish rules prohibiting political candidates or groups from misrepresenting opponents or issues with deceptive images.
Prior to Trump sharing the fake images of Swift and her fans on Sunday, billionaire Tesla CEO Elon Musk had posted a deepfake video featuring a manipulated image of Democratic presidential nominee Kamala Harris.
Public Citizen co-president Robert Weissman said earlier this month that political deepfakes “are rushing at us, threatening to disrupt electoral integrity.”
“Requiring that political deepfakes be labeled doesn’t favor any political party or candidate,” he said. “It simply protects voters from fraud and chaos.”
Common Dreams’ work is licensed under a Creative Commons Attribution-Share Alike 3.0 License. Feel free to republish and share widely.