
OpenAI unveils its new tool: Voice Engine, a voice cloning tool subject to restrictions

OpenAI, a giant in generative artificial intelligence (AI) and the publisher of ChatGPT, on Friday presented a voice cloning tool whose use will be restricted to prevent fraud and crimes such as identity theft.

Named “Voice Engine,” this AI model can reproduce a person’s voice from a 15-second audio sample, according to a statement from OpenAI on the results of a small-scale test.

“We acknowledge that the ability to generate voices resembling those of people poses serious risks, which are particularly important in this election year,” said the San Francisco-based company.

“We are working with American and international partners from government, media, entertainment, education, civil society, and other sectors and taking their feedback into account as we develop the tool.”

In a year of crucial elections around the world, disinformation researchers fear abusive use of generative AI applications (automated production of text, images, etc.), and especially of voice cloning tools, which are cheap, easy to use, and difficult to trace.


OpenAI has assured that it has adopted “a cautious and informed approach” before a wider distribution of the new tool “due to the potential for misuse of synthetic voices.”

This cautious presentation comes after a major political incident, when a consultant working for a Democratic rival of Joe Biden developed an automated calling program that impersonated the US president, who is campaigning for re-election.

The voice imitating Joe Biden called voters to encourage them to abstain from voting in the New Hampshire primary.

The United States has since banned robocalls that use AI-generated cloned voices, in order to combat political or commercial scams.

OpenAI specified that partners testing “Voice Engine” have accepted rules requiring explicit and informed consent from anyone whose voice is duplicated and transparency for listeners: they must clearly know that the voices they hear are generated by AI.

“We have implemented a set of security measures, including a watermark to trace the origin of any sound generated by Voice Engine, as well as proactive monitoring of its use,” insisted OpenAI.


Last October, the White House unveiled rules and principles to regulate the development of AI, including transparency.

Joe Biden has expressed concern that criminals could use the technology to deceive people by pretending to be members of their family.
