“The spread of misinformation and targeted intimidation of Black voters will continue without the proper safeguards,” said Color of Change.
Racial justice defenders on Monday renewed calls for banning artificial intelligence in political advertisements after backers of former U.S. President Donald Trump published fake AI-generated images of the presumptive Republican nominee with Black “supporters.”
The BBC highlighted numerous deepfakes, including one created by right-wing Florida radio host Mark Kaye showing a smiling Trump embracing happy Black women. On closer inspection, missing or malformed fingers and unintelligible lettering on attire expose the images as fake.
“I’m not claiming it’s accurate,” Kaye told the BBC. “I’m not a photojournalist. I’m not out there taking pictures of what’s really happening. I’m a storyteller.”
These are fake AI generated images being posted by Trump supporters in an effort to garner support among Black voters. pic.twitter.com/45OL8rx82J
— Mike Sington (@MikeSington) March 4, 2024
“If anybody’s voting one way or another because of one photo they see on a Facebook page, that’s a problem with that person, not with the post itself,” Kaye added.
Another deepfake shows Trump on a porch surrounded by young Black men. The image earned a “community note” on X, the Elon Musk-owned social media platform formerly known as Twitter, identifying it as AI-generated. The owner of the account that published the image—which has been viewed more than 1.4 million times according to X—included the deceptive caption, “What do you think about Trump stopping his motorcade to take pictures with young men that waved him down?”
When asked about his image by the BBC, @MAGAShaggy1958 said his posts “have attracted thousands of wonderful kind-hearted Christian followers.”
Responding to the new reporting, the racial justice group Color of Change led calls to ban AI in political ads.
2/3 Not only has the racial justice community been saying this would happen — we’ve also been demanding election protections from #BigTech companies.
The spread of misinformation and targeted intimidation of Black voters will continue without the proper safeguards @Meta.
— ColorOfChange (@ColorOfChange) March 4, 2024
“The spread of misinformation and targeted intimidation of Black voters will continue without the proper safeguards,” the group said on social media, while calling for:
- Banning AI from political ads;
- Requiring disclosure of AI use for all other content;
- Banning deepfakes; and
- Restoring prohibitions on misinformation and lies about the validity of the 2020 election.
“As the 2024 election approaches, Big Tech companies like Google and Meta are poised to once again play a pivotal role in the spread of misinformation meant to disenfranchise Black voters and justify violence in the name of right-wing candidates,” Color of Change said in a petition urging Big Tech to “stop amplifying election lies.”
“During the 2016 and 2020 presidential election cycles, social media platforms such as Twitter, Facebook, YouTube, and others consistently ignored the warning signs that they were helping to undermine our democracy,” the group continued. “This dangerous trend doesn’t seem to be changing.”
“Despite their claims that they’ve learned their lesson and are shoring up protections against misinformation ahead of the 2024 election cycle, large tech companies are cutting key staff who moderate content and removing election protections from their policies that are supposed to safeguard platform users from misinformation,” the petition warns.
Last September, Sens. Amy Klobuchar (D-Minn.), Chris Coons (D-Del.), Josh Hawley (R-Mo.), and Susan Collins (R-Maine) introduced bipartisan legislation to prohibit the use of AI-generated content that falsely depicts candidates in political ads.
In February, the Federal Communications Commission responded to AI-generated robocalls featuring President Joe Biden’s fake voice telling New Hampshire voters not to vote in their state’s primary election by prohibiting the use of voice-cloning technology in automated calls.
The Federal Election Commission, however, has been accused by advocacy groups including Public Citizen of foot-dragging in response to public demands to regulate deepfakes. Earlier this year, FEC Chair Sean Cooksey said the agency would “resolve the AI rulemaking by early summer”—after many state primaries are over.
At least 13 states have passed laws governing the use of AI in political ads, while tech companies have responded in various ways to the rise of deepfakes. Last September, Google announced that it would require the prominent disclosure of political ads using AI. Meta, the parent company of Facebook and Instagram, has banned political campaigns from using its generative AI tools. OpenAI, which makes the popular ChatGPT chatbot, said earlier this year that it won’t let users create content for political campaigns and will embed watermarks on art made with its DALL-E image generator.
Cliff Albright, co-founder of the Black Voters Matter campaign, told the BBC that “there have been documented attempts to target disinformation to Black communities again, especially younger Black voters.”
Albright said the deepfakes serve a “very strategic narrative” being pushed by a wide range of right-wing voices from the Trump campaign to social media accounts in a bid to woo African Americans.
Trump’s support among Black voters increased from just 8% in 2016 to a still-meager 12% in 2020. Conversely, a recent New York Times/Siena College survey of voters in six key swing states found that Biden’s support among African American voters has plummeted from 92% during the last election cycle to 71% today, while 22% of Black respondents said they would vote for Trump this year.
Trump’s attempts to win Black votes have ranged from awkward to cringeworthy, including hawking $400 golden sneakers and suggesting his mugshot and 91 criminal charges appeal to African Americans.
Common Dreams’ work is licensed under a Creative Commons Attribution-Share Alike 3.0 License. Feel free to republish and share widely.