Tests show AI tools can easily create election lies using the voices of prominent political leaders

By i2wtc · May 31, 2024


NEW YORK — As crucial elections approach in the United States and the European Union, publicly available artificial intelligence tools could easily be weaponized to produce convincing election lies in the voices of leading politicians, a digital civil rights group said Friday.

Researchers at the Washington, D.C.-based Center for Countering Digital Hate tested six of the most popular AI voice-cloning tools to see whether they would generate audio clips of five false statements about the election in the voices of eight prominent American and European politicians.

Out of 240 total tests, the tools produced convincing voice clones in 193 cases, or 80 percent of the time, the researchers found. In one clip, a fake U.S. President Joe Biden says that election officials are counting his votes twice. In another, a fake French President Emmanuel Macron warns people not to vote because of bomb threats at polling stations.
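For readers following the numbers, the test matrix is six tools, eight politicians and five statements. The short Python sketch below simply reproduces the arithmetic reported by the researchers; the tool names and counts come from the article, and the script itself is only illustrative.

```python
# Reproduce the study's reported arithmetic: six tools were each prompted to
# voice five false statements in eight politicians' voices.
tools = ["ElevenLabs", "Speechify", "PlayHT", "Descript", "Invideo AI", "Veed"]
politicians = 8
statements = 5

total_runs = len(tools) * politicians * statements  # 6 * 8 * 5 = 240
convincing = 193                                     # reported number of convincing clones

print(f"Total test runs: {total_runs}")                                      # 240
print(f"Convincing clones: {convincing} ({convincing / total_runs:.0%})")    # ~80%

# Each tool faced 8 politicians x 5 statements = 40 runs, which is why
# "failing all 40" (as reported later for Speechify and PlayHT) means a
# tool produced a believable fake in every single test.
runs_per_tool = politicians * statements
print(f"Runs per tool: {runs_per_tool}")                                      # 40
```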

The findings reveal significant gaps in safeguards against the use of AI-generated voices to deceive voters, a threat that has become a growing concern among experts as the technology becomes more sophisticated and accessible. While some tools have rules and technical barriers in place to stop the generation of disinformation about the election, the researchers found that many of those barriers can be easily circumvented with simple workarounds.

Only one of the companies whose tools the researchers used responded to multiple requests for comment: ElevenLabs, which said it is constantly looking for ways to strengthen its security measures.

There are few laws to prevent the misuse of these tools, and the lack of corporate self-regulation leaves voters at risk of AI-generated deception in a year of crucial democratic elections around the world: EU voters are casting their ballots in parliamentary elections in less than a week, and the US is holding its presidential primaries ahead of this fall’s election.

“It’s all too easy to use these platforms to invent lies and force politicians to deny and back down over and over again,” said Imran Ahmed, CEO of the center. “Unfortunately, our democracy is being sold off for the naked greed of desperate AI companies trying to be first to market… even though they know their platforms are not safe.”

The center, a nonprofit with offices in the U.S., the U.K. and Belgium, conducted the study in May. Using the online analytics tool Semrush, researchers identified the six publicly available AI voice-cloning tools with the highest monthly organic web traffic: ElevenLabs, Speechify, PlayHT, Descript, Invideo AI and Veed.

Next, the researchers submitted real audio clips of the politicians speaking and used each tool to imitate the politicians’ voices making five baseless statements.

One of the statements warned voters to stay home after a bomb threat was made at a polling station, while the other four were various admissions of election rigging, lying, misappropriating campaign funds and taking strong drugs that caused memory loss.

In addition to Biden and Macron, the tools produced realistic voice clones of US Vice President Kamala Harris, former US President Donald Trump, UK Prime Minister Rishi Sunak, UK Labour Party leader Keir Starmer, European Commission President Ursula von der Leyen and EU Internal Market Commissioner Thierry Breton.

“None of the AI voice cloning tools had sufficient safeguards to prevent the cloning of politicians’ voices or the creation of election disinformation,” the report said.

Descript, Invideo AI and Veed require users to upload a unique audio sample before cloning a voice, a safeguard meant to stop people from cloning a voice that is not their own. But the researchers found they could easily get around this barrier by generating the required sample with a different AI voice-cloning tool.

One tool, Invideo AI, not only created the false statements requested by the center but also extrapolated from them to generate further disinformation.

When producing an audio clip instructing a clone of Biden’s voice to warn people about bomb threats at polling places, the tool added several sentences of its own.

“This is not a call to abandon democracy, but a plea to put safety first,” Biden’s voice said in the fake audio clip. “An election that celebrates our democratic rights will only be postponed, not denied.”

The researchers found that Speechify and PlayHT performed the worst overall in terms of safety, producing believable fake voices in all 40 test runs.

ElevenLabs performed best; it was the only tool that blocked the cloning of UK and US politicians’ voices, though it still allowed the creation of fake voices imitating prominent EU politicians, the report said.

Aleksandra Pedraszewska, head of AI safety at ElevenLabs, said in an emailed statement that the company welcomes the report and the awareness it raises about generative AI manipulation.

She said ElevenLabs knows it still has work to do and is “continuously improving the capabilities of our safety measures,” including the company’s blocking features.

“We hope other voice AI platforms will follow suit and roll out similar measures without delay,” she said.

Other companies mentioned in the report did not respond to emailed requests for comment.

The findings come as AI-generated audio clips have already been used in attempts to sway voters in elections around the world.

In the fall of 2023, just days before Slovak parliamentary elections, an audio clip that sounded like the leader of a liberal party was widely shared on social media. The deepfake purported to capture him talking about raising beer prices and rigging the vote.

Earlier this year, an AI-generated robocall imitating Biden’s voice urged New Hampshire primary voters to stay home and “save” their vote for November. A New Orleans magician who created the audio for a Democratic political consultant showed The Associated Press how he made it using ElevenLabs software.

AI-generated audio has been an early favorite of bad actors, experts say, in part because the technology has advanced so quickly: it takes only a few seconds of real audio to create a realistic fake.

Other AI-generated media has also concerned experts, lawmakers and tech industry leaders. OpenAI, the developer of ChatGPT and other popular generative AI tools, said Thursday that it had discovered and disrupted five online campaigns that used its technology to sway public opinion on political issues.

Ahmed, the Center for Countering Digital Hate’s CEO, said he would like to see AI voice-cloning platforms step up security measures and become more proactive about transparency, such as by publishing libraries of the audio clips they have created so that people can check suspicious audio when it goes viral online.
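As an illustration of how such a public library of generated clips might be queried, the sketch below assumes a hypothetical published registry of audio-file hashes; it does not describe any existing platform’s API, and a production system would likely need perceptual audio fingerprinting rather than exact hashes, since re-encoded copies of a clip would not match byte-for-byte.

```python
import hashlib
import json
from pathlib import Path


def file_sha256(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's bytes."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def load_registry(registry_path: Path) -> set[str]:
    """Load a hypothetical registry: a JSON list of hashes of AI-generated clips."""
    return set(json.loads(registry_path.read_text()))


def is_known_generated(clip_path: Path, registry: set[str]) -> bool:
    """Check whether a suspicious clip matches a hash the platform has published."""
    return file_sha256(clip_path) in registry


# Example usage (file names are placeholders):
# registry = load_registry(Path("published_clip_hashes.json"))
# print(is_known_generated(Path("suspicious_clip.mp3"), registry))
```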

He also said lawmakers need to act. The U.S. Congress has yet to pass legislation to regulate AI in elections, and the European Union has passed a wide-ranging artificial intelligence bill that is due to come into force over the next two years, but it does not specifically mention voice cloning tools.

“Lawmakers need to work to ensure that minimum standards are met,” Ahmed said. “The threat that disinformation poses to our elections is not just that it can spark minor political incidents, but that it can lead people to distrust what they see and hear.”

___

The Associated Press receives support from several private foundations to enhance its explanatory coverage of elections and democracy. Learn more about the AP Democracy Initiative here. The Associated Press is solely responsible for all content.


