Facebook’s lax stance on deepfake audio sparks concerns ahead of elections: Parmy Olson

As misinformation spreads, Facebook's leniency toward deepfake audio raises concerns, emphasising the need for updated policies

In the time it takes to toast bread, one can easily clone the voice of President Joe Biden, creating misleading audio content. While platforms like TikTok and YouTube swiftly remove such clips, Facebook, with three billion users, merely slaps a warning label on them, risking potential havoc in election years. Fake audio, generated by advanced AI tools, is harder to scrutinise than manipulated videos. As misinformation spreads, Facebook's leniency raises concerns, emphasising the need for updated policies in the face of evolving technology and deceptive practices.

Facebook's Tolerance for Audio Deepfakes Is Absurd: Parmy Olson

By Parmy Olson

In the same amount of time it would take to toast a slice of bread, you could clone the voice of US President Joe Biden and share it on social media. You could have him mutter in his slow and gravelly voice: "I've always known Covid-19 was a hoax, it's just useful to pretend it's real," then superimpose the audio on a photo of the president grinning, upload it to TikTok, YouTube and Facebook, and wait.
